Computation aided/driven 'design process': research, failure, heuristics, 'versioning', legacy.
Sunday, February 10, 2008, 07:47 AM - Research, papers
Posted by Administrator
video is a slideshow
A stub post following from the previous post: an attempt to collect sketches that were discarded, put on the back-burner etc. To be expanded into an attempt to theorize/collate the processes of failure, accidental discovery & legacy within computation-driven 'design' processes.

A collection of recent sketches following from a research interest in polygonal geometry: its finite-element data structure, subdivision schemes, the innate algorithmic nature of poly-modelling tools, barycentric coordinates and global continuity, 'greebling' and other possibilities for surface articulation, etc.


In Lieu of a Manifesto: Experiments, Heuristics and Praxis 
Wednesday, February 6, 2008, 07:56 PM - Analytical, Research, papers
Posted by Administrator
The following post is abridged from an unpublished research paper that collates personal experiences related to 'architectural computing' as part of team Manifold, AADRL; as Architectural Assistant at HOK Sport, London; and as Architectural Assistant at ZHA, London. However, a niggling 'computational' task warranted the post. Videos are in real time.

Ted Krueger, in his lecture series Instrument and Instrumentality, uses Herbert Simon's distinctions between 'natural' and 'artificial' sciences to describe the 'sciences' as operating on two agendas: understanding the world 'as it is' and speculating on 'as it should be'.

“Herbert Simon writing in Sciences of the Artificial posits two kinds of science the ‘natural’ and the ‘artificial’. Natural sciences, such as physics, chemistry and biology endeavour to understand the world ‘as it is’. The task is fundamentally descriptive and analytical. It concerns itself with thinking. Sciences of the artificial - business, engineering, and all of design, for examples, give primary consideration to the world ‘as it should be’….”

Krueger, Ted: Instrument and Instrumentality.
V2_Lab; Workshops, lecture Ted Krueger, <www.lab.v2.nl/events/_docs/lecture_krueger.pdf>, p. 1.

Subscription to and extension of the argument would mean that ‘applied science’ could be posited as the bridge between the two. Further, (architectural) ‘design as research’ could be argued to exhibit similar properties of using, translating, transposing and adapting the descriptive tools of natural science to engineer an imagined and wished world.

_geometric relations, logical structures: parametric systems

Architecture as an assembly of design components implicitly suggests 'self-similarity' and 'difference' amongst the parts of the system. The particulars or aspects of this similarity and difference could be many. Nonetheless, our research aims to 'appose' and/or articulate the notion with other issues that deserve to be discussed.

In recent ‘parametric’ systems of design, and of codification of design for construction, this aspect of similarity and difference has tended to be manifested in the production of ‘homeo-morphic’ geometries such as ‘adaptive (to curvature) tiling’ patterns, structural skins etc. Alternatively, there have been systems, like those in robotics, that concentrated on the production of non-topological difference and on aspects of ‘family’ and ‘individual’ on the basis of ‘attributes’ such as constituent parts, specific task-capacities, transformation pathways etc. Without explicit allegiance to one or the other, our research gravitates more towards using the first principles of all such systems: that of algorithmic procedures, machinating ‘control rigs’ where all pre-defined logical relations remain consistent as the system is manipulated or re-evaluated, use of external data as strategic inputs, feedback loops etc. Thus, parametric systems can be understood as systems of and/or related to quantification, determination of logical relations, and continuous evaluation of a system so ‘rigged’ for multiple design outcomes.

Empirical evidence (within our current context) of a multi-tiered, plural, and collaborative design process and environment calls for further articulation of one aspect of the definition above. A possible method of such an articulation, as will be argued in the example below, is through ‘rules-of-thumb’, or an abstraction of information and simplification of the ‘intelligence’ of parametric systems.

The associated images and video relate to an attempt to ‘parametrize’ the effect of built environment on the visual ‘field’ of its context and vice versa. The attempt was to develop an interactive ‘modelling’ tool which would re-evaluate a simple iterative logic upon manipulation of the model: shoot ‘rays’ from every point of an input set of ‘sampling’ points in pre-defined directions, calculate the ratio of the number of rays shot to those intersected by the built mass, and collate the distances at which those intersections occurred. Empirical experience would however point to a computational limit: calculation of the intersection of rays with ‘mesh’ geometry consumes a lot of computational resource and thus slows down execution times. This, in spite of the code predominantly being wrappers to pre-compiled, professional code (from Autodesk Maya), and accounting for the amateur, enthusiast nature of the attempt. This then points to a significant aspect of ‘parametric’ systems: that of the ‘re-evaluation time’ of a ‘parametric’ setup.
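The iterative logic described above can be sketched outside Maya. The following Python toy is not the actual plug-in code: the axis-aligned box standing in for the 'built mass', the sample point, and the coarse fan of pre-defined directions are all invented for illustration.

```python
import math

def ray_aabb(origin, direction, box):
    """Slab test: distance along the ray to an axis-aligned box, or None if missed."""
    bmin, bmax = box
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        d = direction[axis]
        if abs(d) < 1e-12:  # ray parallel to this slab
            if not (bmin[axis] <= origin[axis] <= bmax[axis]):
                return None
            continue
        t1 = (bmin[axis] - origin[axis]) / d
        t2 = (bmax[axis] - origin[axis]) / d
        if t1 > t2:
            t1, t2 = t2, t1
        tmin, tmax = max(tmin, t1), min(tmax, t2)
        if tmin > tmax:
            return None
    return tmin

def sphere_directions(n_theta=8, n_phi=4):
    """A coarse, pre-defined fan of unit directions."""
    dirs = []
    for j in range(1, n_phi + 1):
        phi = math.pi * j / (n_phi + 1)
        for i in range(n_theta):
            theta = 2 * math.pi * i / n_theta
            dirs.append((math.sin(phi) * math.cos(theta),
                         math.sin(phi) * math.sin(theta),
                         math.cos(phi)))
    return dirs

def visual_field(sample_point, boxes, dirs):
    """Ratio of rays blocked by the built mass, plus the hit distances."""
    hits = [min((d for d in (ray_aabb(sample_point, v, b) for b in boxes)
                 if d is not None), default=None) for v in dirs]
    dists = [h for h in hits if h is not None]
    return len(dists) / len(dirs), dists

# one 'building' box to the east of the sample point
boxes = [((5.0, -2.0, 0.0), (7.0, 2.0, 10.0))]
ratio, dists = visual_field((0.0, 0.0, 1.5), boxes, sphere_directions())
print(f"blocked: {ratio:.2f}, nearest hit: {min(dists):.2f}")
```

Boxes keep the intersection test trivial; the Maya tool intersects against arbitrary mesh geometry, which is exactly where the computational cost discussed below comes from.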

Elaborating, most ‘parametric’ platforms (Rhino explicit history, Generative Components, Maya Hypergraph, Catia object tree, rendering shader-networks, et al.) have to account for, in some form or manner, information flow between constituent ‘nodes’: black-boxes that take one or multiple inputs and produce an output, which is further connected as ‘input’ to other such ‘nodes’. It then logically, and presumably from postulates of graph theory, follows that the ‘evaluation time’ of such a set-up has to depend directly on the number of ‘nodes’.
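A minimal, platform-agnostic sketch of such a node graph (pull-based evaluation in Python; the node names and functions are invented, and this is not any of the platforms above): each 'node' is computed once per pass, so a full re-evaluation visits every upstream node.

```python
class Node:
    """A black-box that combines the outputs of its input nodes with a function."""
    def __init__(self, name, fn, *inputs):
        self.name, self.fn, self.inputs = name, fn, inputs

    def evaluate(self, cache):
        # each node is computed once per re-evaluation pass
        if self.name not in cache:
            args = [n.evaluate(cache) for n in self.inputs]
            cache[self.name] = self.fn(*args)
        return cache[self.name]

# a toy rig: width and height drive area, area drives a panel count
width  = Node("width",  lambda: 12.0)
height = Node("height", lambda: 3.0)
area   = Node("area",   lambda w, h: w * h, width, height)
panels = Node("panels", lambda a: round(a / 1.5), area)

cache = {}
print(panels.evaluate(cache), "panels")   # prints: 24 panels
print("nodes evaluated:", len(cache))     # one cache entry per upstream node
```

Changing any upstream value and re-evaluating repeats the whole pass, which is why evaluation time scales with node count.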

It can be argued further that significant ‘time-gains’ can be effected only through more efficient, professional algorithms and data structures. In this particular example, for instance, there are better, professional (stand-alone) tools available that implement efficient algorithms for geo-spatial analysis (e.g. Space Syntax Isovist, MATLAB line-of-sight, et al.).

These aspects of ‘parametric’ systems or ‘architectural computing’, coupled with the nature of the ‘multi-tiered, plural, and collaborative design process and environment’ outlined previously, point towards a need for ‘rules-of-thumb’. Delineating further, these ‘rules’ need to abstract, encapsulate, and allow for the potential to integrate with, more advanced computing logics of science, engineering and construction. This would allow for an informed-yet-fluent design process without placing undue burden on the later stage of integration with ‘computation’ for construction.

Stated differently, most ‘scripting’ and other forms of ‘end-user programming’ within design realms, would be best served as an interface between opening up possibilities in the design realm, and enabling the manifestation of those possibilities in built form.

In the current example, this ‘rule-of-thumb’ was crudely instantiated as a reduction in the number of rays being shot, in order to approximate the result and enable ‘interactive’ feedback…
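A crude illustration of that trade-off, in a self-contained Python sketch (the spherical occluder is a hypothetical stand-in, unrelated to the actual tool): the blocked fraction estimated from a few hundred random rays approximates the dense estimate at a fraction of the cost.

```python
import math, random

def blocked_fraction(n_rays, seed=0):
    """Fraction of random unit directions, shot from the origin, that hit a
    unit sphere centred 4 units along +x."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # uniform random direction on the unit sphere
        z = rng.uniform(-1, 1)
        t = rng.uniform(0, 2 * math.pi)
        r = math.sqrt(1 - z * z)
        d = (r * math.cos(t), r * math.sin(t), z)
        # the ray hits the sphere |p - c| = 1, c = (4, 0, 0), iff the
        # perpendicular distance from c to the ray is below 1
        proj = 4 * d[0]                        # c . d
        if proj > 0 and 16 - proj * proj < 1:  # |c|^2 - proj^2 = dist^2
            hits += 1
    return hits / n_rays

dense  = blocked_fraction(20000)   # 'accurate' but slow
sparse = blocked_fraction(500)     # 40x fewer rays, still close
print(f"dense {dense:.4f}  sparse {sparse:.4f}")
```

The exact blocked fraction here is about 0.016; the sparse estimate wobbles around it, which is precisely the kind of approximation an 'interactive' feedback loop can tolerate.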

…More examples and substantiations to the argument to follow.

v1.0: non-interactive tool, but with shader support


Monday, February 4, 2008, 12:45 PM
Posted by Administrator
video is in real-time
coming soon...



Interactive UVN transposition: Maya 
Wednesday, January 30, 2008, 06:37 PM - Maya.c++.api., Geometry, Tessellations
Posted by Administrator
videos are in real-time
Shows attempts to create an interactive tool that transfers meshes from the UVN space of a reference surface (usually flat, so uvn = xyz) to that of another.

'Uses' (as seen in the videos) relate to 'flowing' multiple components on a 'host' surface, transferring a user-generated triangulated/quad mesh onto a given surface, etc.

The second video also includes the use of the Maya-Qhull interface to generate meshes.
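The core of such a transfer can be sketched in a few lines of Python (a toy stand-in, with an invented 'host' surface; the actual tool also carries the n-offset along the target normal, which is omitted here since the mesh lies flat on the reference):

```python
import math

def host(u, v):
    """Hypothetical target 'host' surface: a doubly curved sheet over [0,1]^2."""
    return (u * 10.0, v * 10.0,
            2.0 * math.sin(math.pi * u) * math.sin(math.pi * v))

# a mesh on the flat reference plane, 10 x 10 units, so uvn coincides with xyz
grid = [(float(x), float(y), 0.0) for x in range(11) for y in range(11)]

# transfer: read (u, v) off the reference, re-evaluate on the host surface
transferred = [host(x / 10.0, y / 10.0) for (x, y, _) in grid]

centre = transferred[5 * 11 + 5]   # the vertex that sat at (5, 5, 0)
print(centre)                      # lifted to z = 2 at the middle of the sheet
```

An interactive version would simply re-run the comprehension whenever the host surface is manipulated, which is the behaviour the videos show.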


Maya fluid and its field of force 
Wednesday, January 30, 2008, 05:45 PM - Maya.c++.api., algorithms
Posted by Administrator
video is in real-time
Shows a visualisation (of the data-set) of the force field around a Maya 'fluid' simulation.

It is a well-known fact that the 'fluid' simulation yields a gradient numeric data-set that can be used to parametrically drive other geometric and/or organizational systems.

In a bid to further research into fluid fields and their use within parametric 'design' systems, we attempted to extract more data from the simulation. Here a 'fluidNode' (a simple wrapper to API methods) is being used to visualize/extract data related to the force exerted by a 'fluid' on point-objects within it.

Videos and images relate to how this might be applied to 'differentiate' and/or generate geometry.
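Outside Maya, the kind of data such a node extracts can be mimicked by trilinearly sampling a vector grid (a Python sketch; the swirl field is an invented stand-in for exported fluid forces):

```python
import math

def make_field(res):
    """A hypothetical stand-in for exported fluid forces: a swirl about the z-axis,
    stored on a res^3 grid over the unit cube."""
    f = {}
    for i in range(res):
        for j in range(res):
            for k in range(res):
                x, y = i / (res - 1) - 0.5, j / (res - 1) - 0.5
                f[(i, j, k)] = (-y, x, 0.0)
    return f

def sample(field, res, p):
    """Trilinear interpolation of the grid at a point in [0,1]^3."""
    g = [min(int(c * (res - 1)), res - 2) for c in p]   # lower cell corner
    t = [c * (res - 1) - gc for c, gc in zip(p, g)]     # fractional position
    out = [0.0, 0.0, 0.0]
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((t[0] if di else 1 - t[0]) *
                     (t[1] if dj else 1 - t[1]) *
                     (t[2] if dk else 1 - t[2]))
                v = field[(g[0] + di, g[1] + dj, g[2] + dk)]
                for a in range(3):
                    out[a] += w * v[a]
    return out

field = make_field(8)
force = sample(field, 8, (0.75, 0.5, 0.5))
scale = 1.0 + math.hypot(force[0], force[1])   # e.g. drive geometry by force magnitude
print(force, scale)
```

The last line hints at the 'differentiation' use: the interpolated force magnitude at each point-object becomes a parameter that scales or otherwise varies the geometry placed there.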


real-time curvature information in Maya viewports. 
Thursday, January 17, 2008, 10:17 AM - Maya.c++.api., Maya.general.modelling, Analytical
Posted by Administrator
video is in real-time
Shows an application of the openGL generic-data visualising node (see previous entry).

The node is being used to visualise 'curvature' under 'real-time' manipulation of a NURBS surface.

Currently supports only one surface at a time. V1.0, however, being based on a hardware shader node, supports multiple objects.

Curvature values are computed from the 'derivative' information that the Maya API provides access to. As such the numeric results are accurate (verified by comparison in Rhino). Only Gaussian and mean curvatures can be displayed.

The color regions are dynamic divisions of the min-max range, which can be set.
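For reference, the same quantities can be checked outside Maya from first principles (a Python sketch using the first and second fundamental forms, with central-difference derivatives standing in for the API's analytic ones; verified here on a sphere, where K = 1/r^2 and |H| = 1/r):

```python
import math

def sphere(u, v, r=2.0):
    """Test surface: a sphere of radius 2, so K = 0.25 and |H| = 0.5."""
    return (r * math.sin(u) * math.cos(v),
            r * math.sin(u) * math.sin(v),
            r * math.cos(u))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def curvatures(f, u, v, h=1e-4):
    """Gaussian (K) and mean (H) curvature from the first and second
    fundamental forms; derivatives taken by central differences."""
    sub = lambda a, b: [x - y for x, y in zip(a, b)]
    p = f(u, v)
    fu = [c / (2 * h) for c in sub(f(u + h, v), f(u - h, v))]
    fv = [c / (2 * h) for c in sub(f(u, v + h), f(u, v - h))]
    fuu = [(a - 2 * b + c) / h**2 for a, b, c in zip(f(u + h, v), p, f(u - h, v))]
    fvv = [(a - 2 * b + c) / h**2 for a, b, c in zip(f(u, v + h), p, f(u, v - h))]
    fuv = [(a - b - c + d) / (4 * h**2) for a, b, c, d in
           zip(f(u + h, v + h), f(u + h, v - h), f(u - h, v + h), f(u - h, v - h))]
    n = cross(fu, fv)
    nl = math.sqrt(dot(n, n))
    n = [c / nl for c in n]
    E, F, G = dot(fu, fu), dot(fu, fv), dot(fv, fv)
    L, M, N = dot(fuu, n), dot(fuv, n), dot(fvv, n)
    den = E * G - F * F
    K = (L * N - M * M) / den
    H = (E * N + G * L - 2 * F * M) / (2 * den)
    return K, H

K, H = curvatures(sphere, 1.0, 0.5)
print(f"K = {K:.4f} (expect 0.25), |H| = {abs(H):.4f} (expect 0.5)")
```

The sign of H depends on the normal orientation (fu x fv here points outward), which is why only its magnitude is compared.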



MEL and openGL visualization. 
Tuesday, January 15, 2008, 03:31 PM - Maya.c++.api., Maya Embedded Language, utilities, programming interests
Posted by Administrator
video is in real-time
A quick screen grab from WIP. Shows a prototype 'node' for Maya that can take MEL inputs and produce openGL entities.

The in-built blind data editor (polyColorBlindData MEL command) provides a handy interface to visualize information embedded within a mesh. The attempt is to build a similar node that can visualize more generic blind-data. It is a simple wrapper around the easy access that the Maya API provides for incorporating openGL in its viewports.

Currently it can take input surfaces and meshes, and vector (point and color) arrays. Output modes are points and lines. The intention is to support quads and line strings as well.

This might help some of our previous tools to work in interactive modes, by obviating the need to make 'intermediate geometry'. Standard uses could include visualizing surface curvature, 'input sites' on a surface, closest-fit planes, point clouds etc.


Maya surface rationalization 
Friday, January 11, 2008, 11:35 AM - Maya.c++.api., Maya Embedded Language, Analytical
Posted by Administrator
video is in real-time
A typical work-flow with the tool-set developed (for Maya) to estimate the number of panels of single, double and infinite curvature needed to span a single NURBS surface without compromising the discernible 'visual' continuity of the surface.
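The estimation step reduces to bucketing panels by their principal curvatures; a minimal Python sketch (the sample values are invented, and in practice the tolerance would encode the 'visual' continuity threshold):

```python
from collections import Counter

def classify(k1, k2, tol=1e-3):
    """Bucket a panel by its principal curvatures: 'flat' if both vanish,
    'single' (developable) if only one does, else 'double'."""
    flat1, flat2 = abs(k1) < tol, abs(k2) < tol
    if flat1 and flat2:
        return "flat"
    if flat1 or flat2:
        return "single"
    return "double"

# hypothetical per-panel principal curvatures sampled off a surface
panels = [(0.0, 0.0), (0.0, 0.02), (0.01, 0.04), (0.0005, -0.0002), (0.05, -0.05)]
counts = Counter(classify(k1, k2) for k1, k2 in panels)
print(counts)
```

Loosening the tolerance trades panel count against visual fidelity, which is exactly the judgement the work-flow in the video makes interactively.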


Weblog under construction 
Thursday, January 3, 2008, 08:11 PM - Maya.c++.api., Maya.general.modelling
Posted by Administrator
image: competition entry for drlX pavilion design; team included Luis Fraguada, Victor Orive, and Brian Dale.

thank you for your patience.


MEL _ Maya API _ comparison video 
Thursday, January 3, 2008, 10:59 AM - Maya.c++.api.
Posted by Administrator
video is in real-time.
A comparison of the relative times consumed by code written in three formats, performing the same task - a platform-specific implementation of Prim's algorithm.

Left to right in the video:
compiled C++ plug-in; MEL script augmented with a C++ plug-in for the hardest part of the code; and entirely MEL script.

In an attempt to 'justify' the spending of learning time on the C++ API side of things in Maya, we conducted the following test:

1. Chose/wrote (using MEL) a processor-intensive algorithm that had minimal geometry creation. Prim's algorithm proved to be quite handy in this regard: the processor requirements of the algorithm seem to grow quite rapidly with the number of elements.

2. Identified the principal bottleneck within the algorithm in regard to execution times, and wrote a C++ Maya plug-in for that part.

3. Transposed the entire algorithm into C++.

The windows in the video above represent these three steps from right to left.

a. The sample mesh was a 'Voronoi' mesh (generated using Qhull) of 100 random points.
b. Approx. 1300 edges, needing about 850 iterations (for this particular mesh) to find the minimal spanning tree.
c. The C++ plug-in took approx. 16 seconds to complete the task, the augmented MEL script took approx. 4 minutes, and the pure MEL script completed about 80% of the task in about 8 minutes.
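The platform-specific implementation is to follow; for reference, a generic heap-based sketch of Prim's algorithm (in Python, with an invented toy graph rather than the Voronoi mesh) looks like this:

```python
import heapq

def prim_mst(n, edges):
    """Prim's minimum spanning tree on an undirected weighted graph.
    edges: (weight, u, v) tuples over vertices 0..n-1."""
    adj = {i: [] for i in range(n)}
    for w, u, v in edges:
        adj[u].append((w, u, v))
        adj[v].append((w, v, u))
    in_tree = {0}                    # grow the tree from vertex 0
    frontier = list(adj[0])
    heapq.heapify(frontier)
    total, tree = 0, []
    while frontier and len(in_tree) < n:
        w, u, v = heapq.heappop(frontier)
        if v in in_tree:
            continue                 # stale edge: endpoint already reached
        in_tree.add(v)
        total += w
        tree.append((u, v))
        for e in adj[v]:
            if e[2] not in in_tree:
                heapq.heappush(frontier, e)
    return total, tree

# a small mesh-like graph: a square 0-1-2-3 with one diagonal 0-2
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0), (5, 0, 2)]
total, tree = prim_mst(4, edges)
print(total, tree)   # prints: 6 [(0, 1), (1, 2), (2, 3)]
```

With a heap the frontier selection is O(E log E); the MEL versions discussed above pay their penalty largely in the per-iteration edge scans and command overhead rather than in the algorithm itself.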

Development time:
MEL: 0.5 days;
Augmented MEL: 0.5 days;
Plug-in: 1.0 days.
(It has to be said that we are in the very early days of learning the Maya API.)

Note: The C++ code inside the plug-in (on the right) has a fair amount of inefficient structure, in that it uses a large number of 'string' operations and duplicates some work between its constituent parts. The MEL script, on the other hand, is quite efficient in that it relies almost entirely on MEL commands and uses only one additional MEL procedure.

More details about the platform specific implementation of the algorithm to be posted soon.

