Chris Weige (Reckon) | Austin, US | Immortal since Dec 23, 2007

    Source: WIRED

    The Robotic Musicianship Group at the Georgia Tech Center for Music Technology just blew our minds with some videos depicting robots playing music with real people. Great, you say. Some fake robot machine can, like, bang around on a drum or something. Not so fast, doubters.

    These robots, developed with funding from the National Science Foundation, listen to humans creating music in real time and play along with them. One might say they improvise.
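
    The article describes a call-and-response interaction: the robot captures a short human phrase in real time, derives a variation, and plays it back while continuing to listen. The sketch below illustrates that loop in Python under loose assumptions; the function names and the pitch-list representation are illustrative placeholders, not the Georgia Tech software, which works on live audio and physical actuators.

```python
# Toy call-and-response loop: capture a phrase, vary it, play it back.
# Everything here is a stand-in for illustration, not the actual system.
import random
import time

def listen_for_phrase():
    # Placeholder for real-time audio/MIDI capture: return 8 random MIDI pitches.
    return [random.randint(48, 84) for _ in range(8)]

def improvise_response(phrase):
    # Placeholder analysis: echo the human phrase with small pitch variations.
    return [pitch + random.choice([-2, 0, 2]) for pitch in phrase]

def play(phrase):
    # Placeholder for driving the robot's arms / sound output.
    print("robot plays:", phrase)

if __name__ == "__main__":
    for _ in range(3):              # three call-and-response exchanges
        human_phrase = listen_for_phrase()
        play(improvise_response(human_phrase))
        time.sleep(0.1)             # pacing stand-in for real-time playback
```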

    They can't pass the actual Turing test, in which a robot must fool a human into thinking it is also a human during a conversation.

    But musical improvisation is another kind of conversation, and I, a human, would believe that the impromptu, non-predetermined parts these robots play had been played by other humans. By that standard, the Georgia Tech team's robots have already passed the musical Turing test (assuming that I'd feel the same way if I were playing along instead of watching the robots in a video, and I'm fairly certain I would).



    We asked Professor Gil Weinberg, head of the program, how these robots manage to parse what humans are playing, and how they manage to play along. How do they figure out which parts to play? As it turns out, the process is somewhat analogous to the way Deep Blue plays chess: by carefully examining its options and then evolving them like biological species to see which one best fits a changing musical environment.

    "The processing allows [the robots] to analyze and improvise," said Weinberg via telephone. "In one of the applications, we use a genetic algorithm... You have a population of something, and then you do mutations to all of these little things — in my case it's musical motifs — mutations and cross-breeding between the musical genes, in our case, and then you have a new population that better fits to the environment.

    He continued, "Very fast, it runs [about] 50 generations of mutations that are cross-bred between the genes and tests whether this is similar to a motif that the saxophone player played, for example. And it plays something back that is a combination of musical genes of what the saxophone player played, what the piano player played — something that is unique that only can be the product of genetic algorithm."
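
    As a rough illustration of the genetic-algorithm idea Weinberg outlines, the sketch below treats short pitch sequences as the "musical genes", scores each candidate by its similarity to the human player's motif, and evolves the population for about 50 generations with crossover and mutation. The names, the fitness measure, and all parameters are assumptions made for this example; the real system presumably works on richer musical material than raw pitch lists.

```python
# Minimal genetic algorithm over musical motifs (illustrative only).
import random

MOTIF_LEN = 8        # notes per motif, as MIDI pitch numbers
POP_SIZE = 32
GENERATIONS = 50
MUTATION_RATE = 0.15

def random_motif():
    return [random.randint(48, 84) for _ in range(MOTIF_LEN)]

def fitness(motif, human_motif):
    # Higher is better: negative summed pitch distance to the human's motif.
    return -sum(abs(a - b) for a, b in zip(motif, human_motif))

def crossover(parent_a, parent_b):
    # Single-point crossover: splice two "musical genes" together.
    cut = random.randint(1, MOTIF_LEN - 1)
    return parent_a[:cut] + parent_b[cut:]

def mutate(motif):
    # Nudge random notes up or down a couple of semitones.
    return [p + random.choice([-2, -1, 1, 2]) if random.random() < MUTATION_RATE else p
            for p in motif]

def evolve_response(human_motif):
    population = [random_motif() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=lambda m: fitness(m, human_motif), reverse=True)
        parents = population[:POP_SIZE // 2]      # keep the fittest half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children
    return max(population, key=lambda m: fitness(m, human_motif))

if __name__ == "__main__":
    sax_motif = [60, 62, 64, 67, 64, 62, 60, 55]  # stand-in for the saxophonist's phrase
    print(evolve_response(sax_motif))
```

    In this toy version the fittest motif simply converges toward the saxophonist's phrase; a real system would also have to reward novelty so that the response is, as Weinberg puts it, a unique combination rather than an echo.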



    The results are fairly astounding. Haile, the drumming robot, has been around for a couple of years, but Shimon, the marimba-playing robot unveiled in early November, handles melody in addition to rhythm. One of the next steps, says Weinberg, is to give the robots the ability to look at whichever human collaborator is playing the most interesting part.
    Wed, Nov 26, 2008
    Categories: technology, science, music, robotics