Friday, November 12, 2010
New Regime of Fineness
Source: http://en.wikipedia.org/wiki/Monadology
1. Leibniz G., The Monadology, translated by George MacDonald Ross, 1999
2. Leibniz G., New System, §11
Image: Controlled-NOT Molecular Gates of NMR-type Quantum Computer
http://www.carolla.com/quantum/QuantumComputers.htm
Thursday, November 4, 2010
Natural Engine
The Deep Impact probe (now on a mission called EPOXI) passed by comet Hartley 2 at 7:01 a.m. PDT... The probe flew through the comet’s diffuse corona at about 27,500 miles per hour and came within 435 miles of its icy, dirty core.
Source: http://www.wired.com/wiredscience/2010/11/epoxi-comet-flyby/
Monday, August 23, 2010
Design of Design
Fred Brooks
Sunday, August 22, 2010
Interstitial Mind
Ecosystem of Bodies
1- F. Nietzsche, The Will to Power, s.636, Walter Kaufmann transl.
Image: Formation of intensity in Mamatus Clouds. Jorn C. Olsen 2004
Sunday, August 15, 2010
Hets
Wednesday, July 7, 2010
Atlantic Architecture
Friday, July 2, 2010
Speed Merchants
Friday, June 25, 2010
Mapping Free Will
Sunday, June 20, 2010
Optimism
'Artistic creation is by definition a denial of death. Therefore it is optimistic, even if in an ultimate sense the artist is tragic.'
Andrei Tarkovsky, Time Within Time: The Diaries 1970-1986, translated by Kitty Hunter-Blair, London, 1994
The Screen
What Is an Event? by Gilles Deleuze, from The Fold: Leibniz and the Baroque, translated by Tom Conley, University of Minnesota Press, 1992.
image: Matthias Dittrich music visualization Java-Applet http://www.matthiasdittrich.com/projekte/narratives/applet/index.html
Saturday, June 19, 2010
Eventalization
M. Foucault, ‘Impossible Prison’ [1980] in Foucault Live, 1996, p. 277
Saturday, June 12, 2010
Photon Farm
A photon, frail by itself as a source of thrust yet essential to the terrestrial ecosystem through its contribution to photosynthesis, makes its electromagnetic qualities visible through macroscopic effects. Scientists have demonstrated here that photons can display particle-like qualities when their power is harnessed in a swarm state.
Mathematics of Hunting
Wednesday, June 2, 2010
Jeff Koons BMW art-car
Source: http://www.wired.com/autopia/2010/06/jeff-koons-art-car-doesnt-suck/
Tuesday, May 11, 2010
Affective Computing
Marvin Minsky, one of the pioneering computer scientists in artificial intelligence, relates emotions to the broader issues of machine intelligence, stating in The Emotion Machine that emotion is "not especially different from the processes that we call 'thinking.'"
Minsky argues that emotions are different ways to think that our mind uses to increase our intelligence, and he challenges the distinction between emotions and other kinds of thinking. His main argument is that emotions are "ways to think" suited to the different "problem types" that exist in the world: the brain has rule-based mechanisms (selectors) that turn on emotions to deal with various problems.
Source: en.wikipedia.org
Image: Fearful face recognition after placebo infusion versus neutral face recognition after placebo infusion. WOEXP: 475. http://neuro.imm.dtu.dk/services/jerne/brede/WOEXP_475.html
Wednesday, May 5, 2010
Linux vs. Genome
A comparison of the networks formed by genetic code and the Linux operating system has given insight into the fundamental differences between biological and computational programming.
The shapes are very dissimilar, reflecting the evolutionary parameters of each process. Biology is driven by random mutations and natural selection. Software is an act of intelligent design.
“One of the biggest problems of biological data is that you have no intuitions about it. It’s just a bunch of gobbledygook symbols. One way to get intuition is to map its structure onto something we know about,” said study co-author and Yale University informaticist Marc Gerstein. “Linux is evolving and changing. But unlike evolution in biology, we know exactly what’s going on.”
Several years ago, he refined a technique for turning gene-network “hairballs” — densely tangled depictions of gene interaction — into hierarchical maps. At the top of each map are what Gerstein calls master regulators, which steer the activity of many other genes. At the bottom are workhorses, which pump out protein code. In between are the middle managers, which do a bit of both.
Since then, Gerstein has compared the structure of gene networks between species, and contrasted biological networks with corporate and governmental structures. He hopes the contrasts will illuminate how network structure shapes genomic function.
In the latest study, published April 4 in the Proceedings of the National Academy of Sciences, he compared the genome of E. coli, a widely studied microbe, to Linux, the popular open source operating system. Though Gerstein hoped for insight into biological networks, the study also suggests strategies for social and technological engineers.
“If we don’t have designers fine-tuning things, and we have to deal with random changes, then what do we need to do in the control structure to make it robust?” said Gerstein.
E. coli’s network proved to have a pyramid-like shape, with a few master regulators, more middle managers, and many workhorses. In stark contrast, the Linux kernel call graph — the network of interactions between different pieces of program code — looks almost like an inverted pyramid. A great many top-level programs call on a few common subroutines.
Gene network structures start to resemble the Linux call graph as species become more complex, according to Sergei Maslov, a Brookhaven National Laboratory systems biologist not involved in the study. However, their pyramids never become as top-heavy as Linux. There seems to be a natural limit to this progression. The new study suggests why.
“If you update a low-level function, then you need to update all the functions that use it. That’s doable if you’re an engineer. You just go through all the code. But it’s impossible in biology,” Maslov said.
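Maslov's point about update cost can be made concrete with a toy call graph (the function names are invented for illustration): altering a low-level routine forces a revision of every function that reaches it, directly or through intermediaries.

```python
# Hypothetical sketch: who must be updated when a low-level routine
# changes? An edge (caller, callee) means "caller depends on callee".
calls = [
    ("app_a", "util"), ("app_b", "util"), ("app_c", "util"),
    ("app_a", "io"), ("io", "util"),
]

callers = {}  # callee -> set of direct callers
for caller, callee in calls:
    callers.setdefault(callee, set()).add(caller)

def must_update(changed):
    """All functions needing revision after `changed` is altered."""
    todo, seen = [changed], set()
    while todo:
        f = todo.pop()
        for c in callers.get(f, ()):
            if c not in seen:
                seen.add(c)       # direct or transitive caller
                todo.append(c)
    return seen

print(must_update("util"))  # every caller, direct or indirect
```

A change to `util` ripples up to all four other functions; that sweep is routine for an engineer with the whole codebase in hand, but, as Maslov notes, unavailable to evolution.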
Indeed, when Gerstein’s team tracked the evolution of Linux kernel code since its original 1991 version, they found that its basic components had undergone extensive alteration. Their biological analogues, the so-called evolutionarily conserved genes, are used in a great many functions yet have hardly changed at all. Once a mutation is in place, evolution can’t quickly update the rest of the genetic code.
Asked if human software engineers have outpaced natural evolution, Gerstein said the opposite was true. The computer model may be so extreme that it can’t be scaled to biological levels of complexity. “You can easily see why software systems might be fragile, and biological systems robust. Biological networks are built to adapt to random changes. They’re lessons on how to construct something that can change and evolve,” said Gerstein.
For now, the researchers have no plans to compare genomes to the most widely-used operating system of all, Windows.
“That’s forbidden,” said study co-author and Stony Brook University biophysicist Koon-Kiu Yan. “Windows is not open source.”
Image: Network structures of E. coli genome and Linux./PNAS.
Source: http://www.wired.com/wiredscience/2010/05/linux-vs-life/
Wednesday, April 28, 2010
clocks
Wednesday, March 3, 2010
Continental Bodies
Monday, January 25, 2010
Self-Optimization
The researchers then borrowed simple properties from the slime mold’s behavior to create a biology-inspired mathematical description of the network formation. Like the slime mold, the model first creates a fine mesh network that goes everywhere, and then continuously refines the network so that the tubes carrying the most cargo grow more robust and redundant tubes are pruned.
The behavior of the plasmodium “is really difficult to capture by words,” comments biochemist Wolfgang Marwan of Otto von Guericke University in Magdeburg, Germany. “You see they optimize themselves somehow, but how do you describe that?” The new research “provides a simple mathematical model for a complex biological phenomenon,” Marwan wrote in an article in the same issue of Science.
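The reinforce-and-prune rule can be caricatured in a few lines. This is a deliberately simplified sketch, not the published model (which computes the actual flows through the mesh): each tube’s conductance is pulled toward the cargo it carries and decays otherwise, and tubes whose conductance collapses are removed.

```python
# Toy sketch of the slime mold's reinforce-and-prune rule, with
# made-up tube names and assumed (fixed) cargo values Q per tube.
# Update rule: dD/dt = |Q| - D  (reinforcement minus decay).
tubes = {"main": 1.0, "detour": 1.0, "dead_end": 1.0}  # initial fine mesh
flow = {"main": 3.0, "detour": 0.5, "dead_end": 0.0}   # assumed cargo

dt, steps, cutoff = 0.1, 200, 0.05
for _ in range(steps):
    for t in list(tubes):
        tubes[t] += dt * (flow[t] - tubes[t])  # grow toward carried cargo
        if tubes[t] < cutoff:
            del tubes[t]  # redundant tube is pruned

print(tubes)  # high-traffic tubes survive and strengthen
```

Even in this caricature the qualitative behavior survives: the unused tube withers away while the busy tubes settle at a conductance proportional to their traffic.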
source: wired.com
Architecture is
As our capabilities for understanding phenomena become more granular and fine, and processing power enables us to map them accurately, we can quantitatively calculate, and thus materialize, an interface that mediates between these domains: one that responds not only to material systems but to flows of asomatous phenomena (climate, Hertzian space, economics, sound, affect, and so on). Almost all trends in contemporary technology and science point toward ubiquitous and ambient models based on fine granulation. Parallel examples appear in almost every field: the development of genetic computation and fitness functions, the Semantic Web (GGG) as a single global machine materializing from all the computing bits and tagged objects in the world, the design of implicit interactions and distributed agents[1], and finally the understanding that the universe is made of bits, storing and processing information in the quantum realm.[2]
A physical structure, being the medium of architecture, forms spontaneously as these fine components try to meet energetic requirements and seek a point of minimal free energy.
In this respect the architectural form, moving rapidly away from the tradition of being the node itself, operates in an in-between, fuzzy mode: an interface that exists only by means of connecting the nodes in the global network of said phenomena.

[1] "The most profound technologies are those that disappear. [They] weave themselves into the fabric of everyday life until they are indistinguishable from it." Mark Weiser, "The Computer for the 21st Century," Xerox PARC, 1991. Weiser also coined the term Ubiquitous Computing.
[2] Seth Lloyd, interview with WIRED magazine, Issue 14.03, March 2006