2009-02-28

Nano-origami

ScienceDaily

Nano-origami Used To Build Tiny Electronic Devices

A team of researchers led by George Barbastathis, associate professor of mechanical engineering, is developing the basic principles of "nano-origami," a new technique that allows engineers to fold nanoscale materials into simple 3-D structures. The tiny folded materials could be used as motors and capacitors, potentially leading to better computer memory storage, faster microprocessors and new nanophotonic devices.

clipped from www.youtube.com

Nano origami

clipped from web.mit.edu
Link: MIT

clipped from web.mit.edu

Nanostructured Origami™ Fabrication and Assembly Process

Figure 1: Bridging the gap between nanoscale and macroscale.
Figure 2: (a) Planar fabrication. (b) Membrane folding. (c) Completed device.

Folding methods
Figure 4: Membrane folding via Lorentz force.
Figure 5: SEM image of a 360° fold.
Figure 6: Overhead view of curling.

Completed devices
Figure 7: Gratings of nanoscale feature size integrated into the folding device.
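
Figure 4 above refers to membrane folding via the Lorentz force: a current driven through a trace on the flap, sitting in an external magnetic field, produces the force that lifts the membrane about its hinge. A rough back-of-the-envelope sketch of that actuation principle (the field, current and geometry values are illustrative assumptions, not numbers from the MIT poster):

```python
# Back-of-the-envelope Lorentz-force estimate for folding a membrane flap.
# All numbers are illustrative assumptions, not values from the MIT work.

def lorentz_force(b_field_t, current_a, trace_length_m):
    """Force (N) on a straight trace in a perpendicular B field: F = B * I * L."""
    return b_field_t * current_a * trace_length_m

B = 0.5        # tesla, external magnetic field (assumed)
I = 10e-3      # amperes, current through the trace on the flap (assumed)
L = 100e-6     # metres, length of the trace along the flap edge (assumed)
arm = 50e-6    # metres, distance from the hinge crease to the trace (assumed)

F = lorentz_force(B, I, L)
torque = F * arm  # torque about the hinge crease

print(f"Force on trace: {F:.2e} N")        # ~5e-07 N
print(f"Folding torque: {torque:.2e} N*m") # ~2.5e-11 N*m
```

Reversing the current direction reverses the force, so the same trace can fold the flap up or down depending on the drive polarity.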

clipped from meche.mit.edu

George Barbastathis


Related:
Nano-origami Used To Build Tiny Electronic Devices
3D Optical Systems Group
Knowing when to fold - MIT News Office
MIT MechE - George Barbastathis
Researchers use nano-origami to build tiny 3D devices - Engadget
Nano-origami - Boing Boing
Nano World: Nano origami supercapacitors
Slashdot | Folding Nanosheets To Build Components

2009-02-27

Einstein Robot

Clipped from: YouTube - Einstein Robot - UCSD Machine Perception Laboratory

Einstein Robot - UCSD Machine Perception Laboratory

Scientists at UC San Diego's California Institute for Telecommunications and Information Technology (Calit2) have equipped a robot modeled after the famed theoretical physicist with specialized software that allows it to interact with humans in a relatively natural, conversational way. The so-called "Einstein Robot," which was designed by Hanson Robotics of Dallas, Texas, recognizes a number of human facial expressions and can respond accordingly, making it an unparalleled tool for understanding how both robots and humans perceive emotion, as well as a potential platform for teaching, entertainment, fine arts and even cognitive therapy.



Clipped from: Introducing the scarily realistic Einstein robot who can tell how you feel | Mail Online


Introducing the scarily realistic Einstein robot who can tell how you feel


With a big bushy moustache and a shock of white hair, scientists have modeled an ultra realistic robot on Albert Einstein. [...] Using specialised software the machine can recognise and respond to a number of human facial expressions in a natural way.

Clipped from: Calit2 : It's All Relative: UC San Diego's Einstein Robot Has 'Emotional Intelligence'

Calit2

"In the short-term, Einstein is being used to develop computer vision so we can see how computers perceive facial expressions and develop hardware to visually react," says Javier Movellan, a research scientist in the Calit2-based UCSD Machine Perception Laboratory (MPL). "This robot is a scientific instrument that we hope will tell us something about human-robot interaction, but also human-to-human interaction.

"When a robot interacts in a way we feel is human, we can't help but react. Developing a robot like this one teaches us how sensitive we are to biological movement and facial expressions, and when we get it right, it's really astonishing."

[...]

Another important part of the robot's inner workings is its Character Engine Artificial Intelligence Control Software, which allows the programmer to author and define the persona of the character so it can hold a conversation.

"Einstein has pretty broad conversational abilities, although not like a human," Hanson notes. "In the demo mode, he might say something like, 'I'm an advanced perceptual robot bringing together many technologies into a whole that's greater than the sum of my parts, but here's what some of my parts can do. I can see your facial expressions and mimic them. I can see your age and gender. So why don't we demo some of these technologies?'"

Clipped from: Einstein Returns As A Robot


Einstein Returns As A Robot




The robot can recognize and mimic 5,000 facial expressions and is powered by 31 motors and innovative new technology. The Einstein robot could be the next big thing in the world of engineering.

Javier Movellan, a professor at the college, says: "The best expressions that it can recognize right now are expressions of sadness and happiness."

Movellan adds, "You know that children with autism have problems recognizing facial expressions and producing facial expressions in social communication. So we're trying to see whether this robot will actually be useful and helpful to teach children with autism how to communicate with human beings."

Scientists hope to use the technology to improve the way computers relate to people.
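
The clips above describe a robot that recognizes a handful of expressions and mimics them with 31 motors. As a purely hypothetical illustration of the mimicry step (the expression labels, servo names and gains below are my own inventions, not part of the UCSD/Hanson Robotics software), detected expression intensities can be blended into target positions for the face servos:

```python
# Hypothetical illustration of expression mimicry: map detected expression
# intensities (0.0-1.0) to servo targets. Expression names, servo names and
# gains are invented for this sketch, not taken from the UCSD/Hanson system.

EXPRESSION_TO_SERVOS = {
    "happiness": {"mouth_corner_left": 0.9, "mouth_corner_right": 0.9, "cheek_raise": 0.6},
    "sadness":   {"brow_inner_raise": 0.8, "mouth_corner_left": -0.5, "mouth_corner_right": -0.5},
    "surprise":  {"brow_raise": 1.0, "jaw_open": 0.7},
}

def mimic(detected: dict[str, float]) -> dict[str, float]:
    """Blend servo targets from the detected expression intensities."""
    targets: dict[str, float] = {}
    for expression, intensity in detected.items():
        for servo, gain in EXPRESSION_TO_SERVOS.get(expression, {}).items():
            targets[servo] = targets.get(servo, 0.0) + gain * intensity
    # Clamp every target to the servo range [-1, 1].
    return {servo: max(-1.0, min(1.0, value)) for servo, value in targets.items()}

# Example: the vision system reports mostly happiness with a little surprise.
print(mimic({"happiness": 0.8, "surprise": 0.3}))
```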


Related:
Introducing the scarily realistic Einstein robot who can tell how you feel | Mail Online
Calit2 : It's All Relative: UC San Diego's Einstein Robot Has 'Emotional Intelligence'
MACHINE PERCEPTION LABORATORY
Einstein Returns As A Robot
Einstein robot smiles when you do | Technology | Reuters
It's All Relative: UCSD's Einstein Robot Has 'Emotional Intelligence' (Video)
NewsDaily: Einstein robot smiles when you do
It's All Relative UC San Diego's Einstein Robot Has 'Emotional Intelligence'
FOXNews.com - Einstein Robot Head Dazzles Tech Conference - Science News | Science & Technology | Technology News
Is Albert Einstein robot too human? Everything’s relative - Times Online

2009-02-26

A Hard Day's Math

Clipped from: YouTube - A Hard Day's Night

A Hard Day's Night


Clipped from: Beatles "A Hard Day's Night" Chord Mystery Solved Using Fourier Transform

"It’s been a hard day’s night
And I’ve been working like a dog"

The opening chord to "A Hard Day’s Night" is also famous because, for 40 years, no one quite knew exactly what chord Harrison was playing.


Clipped from: The "A Hard Day's Night" Chord - Rock's Holy Grail@Everything2.com

"The Songwriting Secrets of The Beatles") summarises 21 different interpretations of the famous chord - just a mere selection of the interpretations he found in his research. Here are a few candidates suggested over the years:
  • A dominant 9th of F in the key of C
  • G-C-F-Bb-D-G
  • C-Bb-D-F-G-C in the key of C
  • A polytriad ii7/V in Ab major
  • G7sus4 (open position)
  • D7sus4 (open position)
  • G7 with added 9th and suspended 4th
  • A superimposition of Dm, F, and G
  • Gsus4/D
  • G11sus4
  • G7sus7/A
  • Dm11 with no 9th
  • Gm7add11
  • G9sus4/D

Clipped from: Beatles Unknown "A Hard Day's Night" Chord Mystery Solved Using Fourier Transform






There were theories aplenty and musicians, scholars and amateur guitar players all gave it a try, but it took a Dalhousie mathematician to figure out the exact formula.

Four years ago, inspired by reading news coverage about the song’s 40th anniversary, Jason Brown of Dalhousie’s Department of Mathematics decided to see whether he could apply a mathematical technique known as the Fourier transform to solve the Beatles’ riddle. The process allowed him to decompose the sound into its original frequencies using computer software and parse out which notes were on the record.
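
Brown's approach boils down to taking a Fourier transform of the recorded chord and reading off which frequencies carry the most energy. A minimal sketch of that idea with NumPy/SciPy (the WAV filename and band limits are placeholders, not Brown's actual code or data):

```python
# Minimal sketch of frequency analysis of a chord recording with the FFT.
# The WAV filename and parameters are placeholders, not Brown's actual setup.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("hard_days_night_chord.wav")  # hypothetical clip
if samples.ndim > 1:
    samples = samples.mean(axis=1)  # mix stereo down to mono

window = samples * np.hanning(len(samples))   # reduce spectral leakage
spectrum = np.abs(np.fft.rfft(window))        # magnitude spectrum
freqs = np.fft.rfftfreq(len(window), d=1.0 / rate)

# Pick the strongest peaks in the musically relevant range (~60 Hz to 2 kHz).
band = (freqs > 60) & (freqs < 2000)
top = np.argsort(spectrum[band])[-20:]        # 20 largest bins in that band
for f, mag in sorted(zip(freqs[band][top], spectrum[band][top])):
    print(f"{f:7.1f} Hz  magnitude {mag:.0f}")
```

Raw peak picking like this only lists frequencies; attributing them to particular instruments and strings is the harder part of the analysis.
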
Clipped from: A Hard Day's Night (song) - Wikipedia, the free encyclopedia
According to Brown, the Rickenbacker guitar wasn't the only instrument used. "It wasn't just George Harrison playing it and it wasn't just the Beatles playing on it... There was a piano in the mix." Specifically, he claims that Harrison was playing the following notes on his 12 string guitar: a2, a3, d3, d4, g3, g4, c4, and another c4; McCartney played a d3 on his bass; producer George Martin was playing d3, f3, d5, g5, and e6 on the piano, while Lennon played a loud c5 on his six-string guitar.[25]
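
To compare candidate notes like these against measured spectral peaks, note names can be converted to equal-temperament frequencies with f = 440 · 2^((m − 69)/12), where m is the MIDI note number. A small helper using standard A4 = 440 Hz tuning (the octave convention in the quoted list may differ from scientific pitch notation):

```python
# Convert scientific-pitch note names (A4 = 440 Hz, equal temperament) to
# frequencies, e.g. to compare against FFT peaks. The octave convention in the
# quoted note list may differ from scientific pitch notation.

NOTE_INDEX = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
              "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def note_to_freq(name: str) -> float:
    """'A4' -> 440.0, 'D3' -> ~146.8 Hz, using 12-tone equal temperament."""
    pitch, octave = name[:-1].upper(), int(name[-1])
    midi = 12 * (octave + 1) + NOTE_INDEX[pitch]   # MIDI note number
    return 440.0 * 2 ** ((midi - 69) / 12)

for note in ["A2", "D3", "G3", "C4", "D5", "G5", "E6"]:
    print(f"{note}: {note_to_freq(note):7.1f} Hz")
```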

Clipped from: Professor Uses Mathematics to Decode Beatles Tunes - WSJ.com

Math Professor Figures Formula for Beatles Success


Jason Brown listens to the Beatles with a uniquely analytical ear. The mathematics professor at Dalhousie University in Halifax, Nova Scotia, says he's figured out the math behind the best of the Fab Four. Now, using "mathematical tricks" he's picked up from the band, he's written a very Beatles-esque song of his own. WSJ's Christina Jeng reports.



Related:
Beatles "A Hard Day's Night" Chord Mystery Solved Using Fourier Transform
The "A Hard Day's Night" Chord - Rock's Holy Grail@Everything2.com
Beatles Unknown "A Hard Day's Night" Chord Mystery Solved Using Fourier Transform
Professor Uses Mathematics to Decode Beatles Tunes - WSJ.com
Beatles hard days night mystery chord Solved with Fourier analysis | NoiseAddicts music and audio blog
A Hard Day's Night (song) - Wikipedia, the free encyclopedia
Mathematician Cracks Mystery Beatles Chord

2009-02-25

Anybots QA -- A Telepresence Robot

Clipped from: Telepresence - Wikipedia, the free encyclopedia

Telepresence

Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance that they were present, or to have an effect, at a location other than their true location.

Telepresence requires that the senses of the user, or users, are provided with such stimuli as to give the feeling of being in that other location. Additionally, the user(s) may be given the ability to affect the remote location. In this case, the user's position, movements, actions, voice, etc. may be sensed, transmitted and duplicated in the remote location to bring about this effect. Therefore information may be traveling in both directions between the user and the remote location.
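
The Wikipedia passage stresses that information travels in both directions: operator commands flow out to the remote robot while video and audio flow back. The toy loop below illustrates only that two-way structure (the message fields and behaviour are invented for this sketch, not Anybots' actual protocol):

```python
# Toy illustration of a bidirectional telepresence loop: operator commands go
# to the robot, sensor frames come back. Message fields and behaviour are
# invented for this sketch; they are not Anybots' protocol.
from dataclasses import dataclass

@dataclass
class Command:              # operator -> robot
    forward_speed: float    # m/s
    turn_rate: float        # rad/s

@dataclass
class SensorFrame:          # robot -> operator
    jpeg_bytes: bytes       # one compressed camera frame
    audio_chunk: bytes      # a short slice of microphone audio

def robot_step(cmd: Command) -> SensorFrame:
    """Pretend robot: apply the command, return what its camera/mic captured."""
    print(f"robot: driving at {cmd.forward_speed} m/s, turning {cmd.turn_rate} rad/s")
    return SensorFrame(jpeg_bytes=b"\xff\xd8...", audio_chunk=b"\x00\x01")

def operator_step(frame: SensorFrame) -> Command:
    """Pretend operator: look at the frame, send the next command."""
    print(f"operator: received {len(frame.jpeg_bytes)} video bytes")
    return Command(forward_speed=0.5, turn_rate=0.0)

cmd = Command(forward_speed=0.0, turn_rate=0.0)
for _ in range(3):          # three round trips of the control loop
    frame = robot_step(cmd)
    cmd = operator_step(frame)
```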

Clipped from: YouTube - Anybots QA Telepresence Robot - TFOT

Anybots QA Telepresence Robot - TFOT

The California-based company Anybots demonstrated its QA telepresence robot at CES 2009. The QA robot weighs 30 pounds (14 kg), stands 5 feet (152 cm) tall and 2 feet (61 cm) when bending, and will be used for telepresence, allowing a user to control the robot from across the globe over the internet while still seeing and being seen, talking and listening, and collaborating in ways and places never before possible.


Clipped from: Anybots

Anybots logo

Experience Telepresence




QA operates simply, cleanly, and quietly while still giving you a full physical presence. It allows you to see and be seen, talk and listen, and collaborate in ways and places never before possible.

Technical Specifications

  • Batteries: rechargeable Li ion, 4-6 hours of operation
  • Connectivity: 802.11g wireless (optional 3G cellular)
  • Cameras: two 5 MP color, with IR illuminator
  • Video: 20 FPS @ 640×480 (depending on network)
  • Audio: full duplex, high fidelity
  • Display: 7 inch (18 cm) color LCD in chest
  • Laser pointer: green 10 mW, points and draws shapes
  • Navigation: LIDAR, 5.5 yard (5 meter) range
  • Speed: up to 6 MPH (10 km/h)
  • Wheels: two 12 inch (30 cm) diameter rubber
  • Height: 5 foot (152 cm) standing, 2 foot (61 cm) bending
  • Weight: 30 pounds (14 kg)
  • Client software: PC and Mac compatible
Clipped from: YouTube - Anybots' New Telepresence Robot QA Video1

Anybots' New Telepresence Robot QA





Related:
Telepresence - Wikipedia, the free encyclopedia
Anybots
GetRobo: QA - New Telepresence Robot from Anybots
Anybots rolls out QA, the telegenic telepresence robot- Engadget
Anybots’ QA telepresence robot
Live From CES: Will Physically Going to CES Become Obsolete? | Discoblog | Discover Magazine
Anybot Telepresence Robot » Coolest Gadgets


2009-02-24

Sensacell a Smart Human Interface Technology

Clipped from: +SENSACELL+

Clipped from: digitalexperience » Blog Archive » Sensacell
The Sensacell system consists of 6 inch x 6 inch modules which comprise interactive sensor surfaces that can be assembled to any size or shape. The modules contain high-brightness LED lighting arrays available in many colours, shapes and sizes. This provides a wide variety of visual feedback possibilities.
Clipped from: Sensacell RGB Color Module Demo

Sensacell RGB Color Module Demo

The Sensacell system is a revolutionary human interface technology ideal for smart architecture, interactive multimedia, retail entertainment, and a host of exciting new applications. This video shows the HS61-36-RGB module in action



Clipped from: Sensacell Storefront Window Kiosk Demo

Sensacell Storefront Window Kiosk Demo


Clipped from: Sensacell Corporation Provides Interactive Floor System for the 2008 World Expo, in Zaragoza Spain.

Sensacell Corporation Provides Interactive Floor System for the 2008 World Expo, in Zaragoza Spain.

Sensacell Corporation announces its contribution to the 2008 World Expo, themed "Water and Sustainable Development." The floor graces the entrance to the Spain Pavilion, "Comunitat Valenciana." Visitors walking across the floor leave behind luminous footprints, a striking visual metaphor for the "footprint" created by the actions of mankind in his environment. The 250 square foot (25 square meter) floor system consists of over 1,000 Sensacell HSI64-36-W interactive modules. Each module contains capacitive sensors and an LED lighting system; the capacitive sensors allow the floor to detect and respond to visitors' footsteps right through the 20 mm thick architectural glass that forms the surface of the floor.
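
The press-release numbers are self-consistent: 250 square feet of 6 inch x 6 inch modules works out to 250 / 0.25 = 1,000 modules, matching the "over 1,000" figure. The behaviour itself is simple to sketch: a footstep detected by a module's capacitive sensor lights its LEDs, and the light fades after the visitor moves on. A tiny "luminous footprint" simulation (grid size and fade rate are assumptions for illustration, not Sensacell parameters):

```python
# Tiny simulation of a "luminous footprint" floor: stepping on a cell sets its
# brightness to full, and brightness decays each tick. Grid size and fade rate
# are assumptions for illustration, not Sensacell parameters.

GRID_W, GRID_H = 8, 4      # modules across / deep (assumed)
FADE_PER_TICK = 0.2        # brightness lost per update (assumed)

brightness = [[0.0] * GRID_W for _ in range(GRID_H)]

def step_on(x: int, y: int) -> None:
    """A capacitive sensor at (x, y) detects a foot: light the cell fully."""
    brightness[y][x] = 1.0

def tick() -> None:
    """One update cycle: every cell fades toward dark."""
    for row in brightness:
        for x in range(GRID_W):
            row[x] = max(0.0, row[x] - FADE_PER_TICK)

def render() -> str:
    levels = " .:-=+*#%@"  # 10 brightness levels rendered as ASCII
    return "\n".join("".join(levels[int(v * 9)] for v in row) for row in brightness)

# A visitor walks diagonally across the floor, leaving a fading trail.
for i in range(4):
    step_on(i * 2, i)
    tick()
print(render())
```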




Related:

+SENSACELL+
digitalexperience » Blog Archive » Sensacell
Sensacell Corporation Provides Interactive Floor System for the 2008 World Expo, in Zaragoza Spain.
digitalexperience » Blog Archive » Sensacell interactive floor