November 3, 2011

OmniTouch turns any surface into a touchscreen interface

OmniTouch displays interface on user's hand

Had Shakespeare been born several centuries later, he might have said "All the world's an interface," especially if he'd had a chance to play with the recently developed, wearable OmniTouch system. While interactive interface projectors are far from new, this innovative concept design utilizes a different approach that promises to turn just about any solid surface into a touch-sensitive input device. Books, tables, walls, hands and other body parts: it's all fair game.

Shoulder-mounted OmniTouch prototype displays an interface on the user's arm

In its current proof-of-concept iteration, which was prototyped at idea-rich Microsoft Research in Redmond, Washington, by PhD student Chris Harrison and his team, the rough-hewn, shoulder-mounted device resembles a sci-fi prosthetic weapon, but looks can be deceiving.
"We explored and prototyped a powerful alternative approach to mobile interaction that uses a body-worn projection/sensing system to capitalize on the tremendous surface area the real world provides," explains Harrison.

Like the proverbial "better" mousetrap, the concept of mobile interaction seems prone to constant tinkering. The OmniTouch draws from a blend of disciplines to overcome numerous issues that beset similar devices. Some approaches require placing markers on the fingertips but still can't discern whether the fingers are "clicked" (touching the surface) or hovering. Others can't "read" surfaces beyond those of the user's own body or they lack the ability to respond to touch/drag motions.

To surmount these hurdles, Harrison and his colleagues combined a PrimeSense short-range depth camera with a Microvision ShowWX+ laser pico-projector. The camera generated a 320x240 depth map at a rate of 30 fps, even for objects as close as 8 inches (20 cm). The projector delivered a sharp, focus-free, wide-angle image independent of the surface's distance - a useful property in such applications. Both devices were then linked to a desktop computer.

The OmniTouch gets its edge in finger position detection through a complex series of calculations that begins with the generation of the depth map. The second video below contains a detailed description of the process which enables the device to determine whether one's fingers are floating above a surface or actually contacting it. The inputs yielded closely approximate those of touchscreens and mice, so the possibilities for the OmniTouch are seemingly endless. Let's hope the wait for a commercial version isn't.
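The click-versus-hover decision boils down to comparing the fingertip's depth against the depth of the surface immediately around it. A minimal sketch of that idea in Python (the patch size and 20 mm contact threshold are illustrative assumptions, not the parameters from the paper):

```python
from statistics import median

def is_clicked(depth_map, tip, patch=5, click_mm=20.0):
    """Decide whether a fingertip is touching the surface or hovering.

    depth_map : list of rows of depth values in millimetres
    tip       : (row, col) index of the detected fingertip
    patch     : half-size of the neighbourhood sampled around the tip
    click_mm  : assumed gap below which the finger counts as touching
    """
    r, c = tip
    tip_depth = depth_map[r][c]
    # Estimate the surface depth from the neighbourhood around the tip;
    # the median is robust to the few pixels covered by the finger itself.
    samples = [depth_map[i][j]
               for i in range(max(r - patch, 0), min(r + patch + 1, len(depth_map)))
               for j in range(max(c - patch, 0), min(c + patch + 1, len(depth_map[0])))]
    surface_depth = median(samples)
    # A small tip-to-surface gap means contact; a large one means hovering.
    return abs(surface_depth - tip_depth) < click_mm
```

A hovering finger sits closer to the camera than the surface behind it, so its depth differs sharply from the surrounding median, while a touching finger lies within a couple of centimetres of it.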

The paper, OmniTouch: Wearable Multitouch Interaction Everywhere, by Chris Harrison, Hrvoje Benko and Andy Wilson was presented in the Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (Santa Barbara, California, October 16 - 19, 2011).

© 2011 Electroclub

November 2, 2011

Robot that can climb walls

The TBCP-II can transfer from horizontal to vertical surfaces over inside corners (as seen here) and outside corners

Researchers have used a variety of technologies to give robots the ability to climb walls, such as magnets, rolling seals and electro-adhesive properties. But when it comes to wall-climbing robots, it's hard to go past the humble gecko for inspiration. The gecko's specialized toe pads, which contain hair-like structures that allow it to scale smooth vertical surfaces, have already provided inspiration for the four-legged Stickybot, and now researchers at Canada's Simon Fraser University (SFU) in Burnaby claim to be the first to apply the gecko's wall-climbing technique to a robot that operates like a tank.
The researchers created adhesives that mimic the dry but sticky toe pads of the gecko, also known as dry fibrillar adhesives, using a material called polydimethylsiloxane (PDMS) that was manufactured to contain very small mushroom-cap shapes 17 micrometers wide and 10 micrometers high.
"The thin, flexible overhang provided by the mushroom cap ensures that the area of contact between the robot and the surface is maximized," says Jeff Krahn. "The adhesive pads on geckos follow this same principle by utilizing a large number of fibers, each with a very small tip. The more fibers a gecko has in contact, the greater attachment force it has on a surface."
The researchers say applying the adhesive to tank-like robots driven by belts instead of legs offers several advantages. Tank-like robots have a simplified mechanical design and control architecture and also boast increased mobility and can be easily expanded if there is the need to increase the load a robot is carrying.
The 240 g (8.46 oz) robot developed by the SFU researchers, which has been given the catchy name of the Timeless Belt Climbing Platform (TBCP-II), has been fitted with a multitude of sensors that allow it to detect its surroundings and change its course accordingly. It is also able to transfer from a flat horizontal surface to a flat vertical surface, over both inside and outside corners, at speeds of up to 3.4 cm/s (1.34 in/s).
The SFU researchers say the wall-climbing technology employed in TBCP-II has wide-ranging potential applications, including inspecting pipes, buildings, aircraft and nuclear power plants, and in search and rescue operations.
The team's study A tailless timing belt climbing platform utilizing dry adhesives with mushroom caps was published today in the journal Smart Materials and Structures.

The video below shows the TBCP-II climbing a whiteboard and transitioning from a horizontal surface to a vertical surface around an outside corner.


NVIDIA GeForce GTX 580M claims world's fastest laptop GPU title

Alienware M18x is the first gaming laptop to offer GTX 580M GPU and has the option of combining two in SLI mode

Hardcore mobile gaming fans have a powerful new graphics processing option in the form of NVIDIA's GeForce GTX 580M. Billed as the world's fastest notebook GPU and capable of playing the latest games at 1080p in 3D, the new chip can be found under the bonnet of the Alienware M18x and is soon to be featured in the Alienware M17x.

The card features 384 CUDA cores with a processor clocked at 1240MHz and up to 2GB of GDDR5 memory clocked at 1500MHz. NVIDIA says that an Alienware M17x equipped with the GTX 580M and NVIDIA Optimus technology can deliver up to 100fps in Call of Duty: Black Ops and up to 5 hours of battery life while internet browsing, while the Alienware M18x offers the option of doubling gaming performance with two SLI-paired GTX 580M GPUs on board.
Running on a PCI-E 2.0 bus, the GTX 580M is DirectX 11-enabled and supports many of NVIDIA's technologies, including:
  • NVIDIA 3D Vision technology - a system which automatically converts 2D content (games, movies) into stereoscopic 3D;
  • NVIDIA 3DTV Play software - allows for connecting 3D Vision-based notebooks to 3D TVs;
  • NVIDIA SLI - links two GTX GPUs which work simultaneously;
  • NVIDIA PhysX engine - a system for realistic, GPU-accelerated in-game physics;
  • NVIDIA Optimus - enables switching the GPU on and off for longer battery life;
  • NVIDIA CUDA architecture - used for GPU-accelerated general purpose computing.
NVIDIA also introduced the GeForce GTX 570M laptop GPU, running at 1150MHz with 336 CUDA cores and GDDR5 memory on a 192-bit bus running at 1500MHz.
The company has published a 3DMark Vantage performance diagram which shows the comparison of its high-end laptop GPUs as well as 3DMark scores of the newest arrivals.


November 1, 2011

Robot Solves Sudoku

Sudoku Solver
This little robot solves a sudoku puzzle all by itself.
It scans the sudoku puzzle using a light sensor. It calculates the solution to the puzzle and then writes the digits.


It first performs a quick scan to determine which cells are empty. It sweeps the light sensor over the sudoku puzzle while continuously reading the sensor values.
Each non-empty cell is then handled. An image of the digit is created by scanning the cell line by line moving the robot forward a tiny bit for every line.


The scanned image is very noisy and blurred.
To sharpen it, the image is transformed into a binary (black and white) image: each pixel with a value below a threshold is set to black, and the rest are set to white.
The values from the light sensor are very close to each other and the overall values are affected by surrounding light, distance between surface and sensor and so on.
The threshold value is calculated by applying an automatic threshold algorithm (The Otsu Method).
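The Otsu method picks that threshold automatically by maximizing the between-class variance of the grey-level histogram, which makes it robust to the lighting and distance effects mentioned above. A self-contained sketch of the algorithm:

```python
def otsu_threshold(pixels, levels=256):
    """Pick the threshold that best separates a noisy scan into ink and paper.

    pixels : flat iterable of integer grey values in [0, levels)
    Returns the grey level t; pixels at or below t are classed as black.
    """
    # Histogram of grey values.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)

    sum_all = sum(i * hist[i] for i in range(levels))
    sum_bg = 0.0     # weighted sum of the "background" (dark) class
    weight_bg = 0    # pixel count of the dark class
    best_t, best_var = 0, -1.0

    for t in range(levels):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        # Between-class variance: Otsu's method maximises this quantity.
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t

    return best_t
```

Because the threshold is derived from the image's own histogram, the same code works whether the light sensor reads the puzzle in bright or dim surroundings.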


The binary image may have more than one segment. It may have pixels not belonging to the digit. The center segment is found and other segments are deleted.


The image is converted to 1-pixel wide lines with a thinning algorithm.

Feature extraction

Now the digits are classified by extracting some features from them:

  • Width of the digit
  • Number of tips
  • Position and direction of tips
For example, a digit with only one tip, pointing to the right, is classified as a 6.
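A rule-based classifier over those features might look like the sketch below. Only the "one tip pointing right means 6" rule comes from the text; the other rules (and the width cutoff) are illustrative guesses, not the robot's actual rule set:

```python
def classify_digit(width, tips):
    """Toy feature-based digit classifier in the spirit described above.

    width : digit width in pixels
    tips  : list of (position, direction) pairs for line endpoints,
            e.g. [("top", "right")]
    """
    if len(tips) == 0:
        return 0          # closed loop, no endpoints (illustrative rule)
    if len(tips) == 1:
        pos, direction = tips[0]
        if direction == "right":
            return 6      # single tip pointing right -> 6 (from the article)
        if direction == "left":
            return 9      # mirrored case (illustrative rule)
    if len(tips) == 2 and width <= 4:
        return 1          # narrow stroke with two ends (illustrative rule)
    return None           # a real system would fall back to more features
```

A full recognizer would need rules (or a lookup table of feature vectors) covering all nine digits, but the structure is the same: a handful of cheap geometric tests rather than heavyweight image matching, which suits the slow Mindstorms processor.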
Processing stages: raw image, thresholding, segmentation, thinning

Solving the Sudoku

Solving a sudoku puzzle is a quite simple task for a computer. It is usually implemented using a recursive backtracking algorithm. But since the Mindstorms processor is rather slow, and since it doesn't allow for recursive functions, it took some care to optimize it.
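On a desktop, the textbook recursive backtracking approach looks like this (a standard solver for illustration, not the robot's hand-optimized iterative version):

```python
def solve(grid):
    """Recursive backtracking sudoku solver; grid is 9x9, 0 = empty cell."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in range(1, 10):
                    if valid(grid, r, c, d):
                        grid[r][c] = d
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # undo and try the next digit
                return False             # no digit fits here: backtrack
    return True                          # no empty cells left: solved

def valid(grid, r, c, d):
    """Check row, column and 3x3 box constraints for digit d at (r, c)."""
    if any(grid[r][j] == d for j in range(9)):
        return False
    if any(grid[i][c] == d for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != d for i in range(3) for j in range(3))
```

The Mindstorms firmware's lack of recursion means the same search has to be rewritten with an explicit stack (or an iterative cell index with undo), which is where the optimization effort mentioned above goes.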


October 31, 2011

The 300kW Varley evR450 Electric Sportscar

The Varley evR450

Another high-performance electric sportscar broke cover last week from an unlikely source. To be built by a 125 year-old Australian company specializing in aerospace, defense and electric vehicles, the less-than-US$215,000 gull-winged Varley evR450 was on display at the Third Annual Australian Electric Vehicle Conference.
Composites will enable the evR450 to be "featherweight" and development relationships with other EV specialists Tritium and Ultramotive have yielded some significant advantages for the vehicle, though we're unlikely to see those advantages announced until 2012.
The evR450 is believed to be running two Ultramotive CARBON motors, each with a peak power rating of 150kW (and 522 Nm of torque) and each offering continuous power of 58 kW (122 Nm). With each motor weighing only 40 kg (excluding inductors), the evR450's 300 kW of peak power will be pushing a very light vehicle, and with in excess of 1000 Nm of torque, acceleration can be expected to be blistering.

The company is claiming a 0-100 km/h time of 3.8 seconds and an electronically-limited top speed of 200 km/h (125 mph). The evR450's range of 150 km (93 miles) can be doubled with an additional (but obviously heavier) lithium-ion battery pack.
The evR450 is powered by lithium-ion cells in a 24 kWh pack and can be easily charged by plugging into mains power. The inverters that control the AC motors are described as "state of the art", coming off the back of years of development in the solar car racing industry.


Amazing Robot With Eyes

Meka's robot head makes eyes at next-wave users (w/ video)

You've seen robots in the form of alpha dogs and insect swarms capable of military deployments, but the next wave of robotics also includes a special breed of "sociable" robots for use in medicine, eldercare, and education. These robots demand that scientists build robotic parts that can mimic their human counterparts and respond to social cues.
Small wonder that human heads were turning last month at the IROS (Intelligent Robots and Systems) 2011 event in San Francisco, where visitors met the most sociable humanoid head around, attached to a torso and arm assembly. The robot belongs to Professor Luis Sentis of the University of Texas at Austin.
The goal of the Meka robot was to design a head that can elicit a sense of trust and sociability. In viewing the head, it is difficult not to feel an affinity, even though the head is not at all truly representational of a human head but rather is a curious blend of anime and cartoon character searching for a DreamWorks home.

The head carries much expressiveness and the focal resting points are the engaging eyes. Meka co-founder Aaron Edsinger fielded questions about the robot at the event, and provided technical details: The humanoid head is a seven degree-of-freedom robotic active vision head. Designed for a wide range of expressive postures, it is a useful platform for researchers interested in human-robot interaction and social robotics. 

The system features high resolution FireWire cameras in each eye, integrated DSP controllers, zero-backlash Harmonic Drive gearheads in the neck, and Meka M3 and ROS software stacks. When asked why the little ears, he answered, "That was my idea. I had a puppy, so they were inspired by that."
Edsinger also noted that the robot involved the Meka team and the Human Centered Robotics Group at UT Austin. At UT Austin, Professor Luis Sentis is known for his work with force augmentation, which looks into how robots can help to augment the movements of humans. "We worked closely with Professor Sentis to achieve a friendly anime aesthetic for the overall look, adding two 2DOF ears with RGB lighting for added expressive ability."
Meka Robotics, in addition to the humanoid head, offers specialized parts: hands, grippers, arms, torsos. The company supports the Willow Garage Robot Operating System (ROS), an open source platform for personal robots, including RVIZ kinematic visualization, URDF descriptions, posture control of all joints, and common sensor interfaces.
Meka is a 2006 spin-off from the MIT Computer Science and Artificial Intelligence Lab, where its founders, Edsinger and Jeff Weber, developed the Domo humanoid robot. Weber’s work with prosthetic ankles and knees at the MIT Media Lab led to the first commercially available powered prosthetic ankle.

Chinese Robots Play Table Tennis

Ping-pong robots debut in China (w/ video)

Last week some oohs and ahhs were in order as two ping-pong playing robots made their debut at Zhejiang University in China. The two robots played against each other and with humans. True, this was not the first time the world witnessed robots playing table tennis. There was Topio, a robot made by Vietnamese robotics firm, TOSY.

Topio was a Terminator-looking robot with a chiseled torso and formidable size. Its processors and artificial neural network were able to analyze the ball's path. Elsewhere, flying quadrocopters have whacked balls back to a human player at the Swiss Federal Institute of Technology's Flying Machine Arena in Zurich.

Last week's debut of the robots in China was impressive nonetheless.
The Zhejiang University robots served, returned balls, and scored. Each robot has motorized joints that provide backhand and forehand capabilities and a number of arm movements. Each robot is 1.6 meters tall and weighs 55 kg.
Xiong Rong, the director of the university's Robotics Laboratory, said the 30-joint robots took them four years to develop.

Setting out to create robots that can play a game like ping-pong is challenging enough; the game requires especially fast response times and reflexes. The Chinese team responded to the challenge. They gave the robots eye-mounted cameras that capture 120 images per second. The images are transferred to the robots' processors, which calculate the ball's position, speed, angle, landing position and path. Overall, the university team responsible for the robots is being credited with an identification and positioning system of enough merit to compete with human players.
If intuition is an edge for human players, the robots use mechanical precision to contend. According to reports, their ability to predict the ball's landing position is quite accurate, with a margin of error of 2.5 cm.
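A landing-position predictor of the kind described can be sketched with simple projectile motion. This ignores drag and spin, which a competitive system would also have to model, so it is an illustration of the idea rather than the Zhejiang team's actual algorithm:

```python
from math import sqrt

def predict_landing(x0, z0, vx, vz, table_z=0.0, g=9.81):
    """Predict where a ball will cross table height, ignoring drag and spin.

    (x0, z0) : current horizontal position and height in metres
    (vx, vz) : current velocity components in m/s (vz positive = upward)
    Solves z0 + vz*t - 0.5*g*t**2 = table_z for the later crossing time.
    """
    # Quadratic in t: -0.5*g*t^2 + vz*t + (z0 - table_z) = 0
    disc = vz * vz + 2.0 * g * (z0 - table_z)
    if disc < 0:
        return None                  # ball never reaches table height
    t = (vz + sqrt(disc)) / g        # positive root: the downward crossing
    return x0 + vx * t               # horizontal position at that instant
```

In practice the cameras' 120 position samples per second would be fitted to such a model and the prediction refined frame by frame, which is consistent with the reported ~2.5 cm margin of error.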
Still, reports say that more advanced ping-pong players with special skills can beat the robots, even with the robots’ accuracy and fast speed. The robots cannot perform complicated ping-pong techniques such as slicing and curving.
The robot program of Zhejiang University is part of a high-tech development plan, designed by the Ministry of Science and Technology, to stimulate the development of advanced technology in biotechnology, information technology and automation. The two robots are described as an exercise to demonstrate the range of possibilities of robotic technologies. The university also aims to develop robots that can do housework.

Next-generation charging technology for environmentally friendly vehicles


APEI's high performance SiC power module technology for increased efficiency and power density. Credit: Arkansas Power Electronics International, Inc.
Plug-in electric vehicles represent a new direction for environmentally friendly transportation. Unfortunately, plug-in electric cars currently rely on grid-tied power electronics and can require large quantities of energy -- and time -- to charge. As plug-in cars become more and more widely used, large amounts of power will be required to quickly charge these vehicles.

Arkansas Power Electronics International (APEI) is one of the companies working on a solution to this challenge. A small research and development company based in Fayetteville, Arkansas, APEI's goal is to build state-of-the-art technology for the development and application of power electronics.
The Department of Energy's research agency, ARPA-E, has selected APEI as one of the companies it funds to help develop more energy-efficient power electronics. As part of the Agile Delivery of Electric Power Technology project, APEI's research will help create a power module that can support the demands of plug-in electric vehicles.
Improved semiconductors
APEI has spent the last 10 years working on a way to implement silicon carbide semiconductors into its power electronics to replace standard silicon semiconductors. Silicon carbide semiconductors are applied where extreme heat and harsh environments are commonplace, such as the wing of an aircraft or under the hood of a hybrid car. Because of these extreme conditions, silicon carbide semiconductors are built to withstand temperatures in excess of 600 degrees Celsius.
Modern silicon semiconductors generally can't handle temperatures higher than 150 degrees Celsius. Heat is no longer a limitation when designing silicon carbide power modules, but is instead a design factor. The silicon carbide power module that APEI helped develop along with the University of Arkansas won an R&D 100 award in 2009 for being one of 100 new global technological breakthroughs.
"Silicon carbide allows a lower on-resistance for a given blocking voltage versus traditional silicon," said Ty McNutt, director of business development at APEI. A lower on-resistance has profound advantages for a semiconductor. "Smaller and faster switches can be fabricated with less switching and conduction losses," said McNutt. APEI's silicon carbide semiconductors are more energy efficient than silicon semiconductors.

Performance power modules
As a result of the silicon carbide semiconductor's development, APEI also designed a new power module that can help provide the power conversion necessary to charge plug-in electric vehicles. "The advantages are many, from higher efficiency to reduced size and weight enabled by high frequency operation," said McNutt.
The new power module is called the multichip power module, and is designed to be a very compact, cost-efficient, lightweight solution for the plug-in vehicle's charging dilemma. APEI's patented power module technology integrates both the power and control circuitry into one compact power module.
The development of silicon carbide semiconductors has led to the need for power modules that will reduce cost and increase efficiency for power electronics. "APEI, Inc.'s multichip power module technology is designed around the silicon carbide components," said McNutt. Because of the "ultra-high speed switching for greater efficiency... the power modules are also capable of temperatures in excess of 250 degrees Celsius, offering the end user greater thermal headroom over traditional silicon electronics."
Taking the heat
Since silicon carbide semiconductors operate at such high temperatures, the thermal management system within the power module does not have to play such an integral role in the module's function. With a lighter and smaller thermal management system, the multichip power module can be much smaller.
APEI's new module aims to charge at an efficiency of greater than 96 percent, while most modern power modules charge at efficiencies below 92 percent.
APEI's power module technology is also a very high power density module. The power output per kilogram for APEI's silicon carbide power module is 25 kilowatts, while other "state-of-the-art" power modules only put out 2.5 kilowatts per kilogram.
Future of plug-ins
"The higher temperature capability and higher switching frequency one can achieve by combining these two technologies will allow power electronics systems to obtain a tenfold reduction in size and weight if the system is designed around the technology," said McNutt.
APEI's charging module is one of the new technologies paving the way for green energy vehicles in the future. Weight reduction and increases in efficiency make the future look promising for technologies implementing silicon carbide technology. Electric vehicles offer an innovative direction for personal transportation, especially as rapid-charging is developed to make them more convenient.
As plug-in hybrids continue to become more and more widely available, it's very possible that gasoline-powered vehicles will no longer be the most popular option for personal transportation.

Venus Flytrap Robot

Mohsen Shahinpoor, a professor of mechanical engineering at the University of Maine, has created a robot version of the famous bug-eating Venus Flytrap, using a material he invented several years ago. Named ionic polymeric metal composite (IPMC), it's a nanomaterial that can be used to mimic muscle function. In his paper published in Bioinspiration & Biomimetics, Shahinpoor describes how he used this material to recreate the sensing and closing abilities of the Venus Flytrap.

To make the robot, Shahinpoor fashioned two "leaves" out of the material he'd created to mimic the mouth-like appearance of the Flytrap. He then created a spine from a roll of copper. Then, to copy the tiny hairs on the Flytrap that function as sensors, he added very small strips of the IPMC material. The rest of the robot was constructed as a normal relay and voltage-generation system.
The new material works in two ways. First, because of the unique properties of the IPMC material, simply touching it generates a very small voltage. Second, the material exhibits muscle-like flexing or bending when given a charge. In the robot, the small strips serve as sensors that, when touched, relay a tiny charge to the voltage generator, which sends a somewhat larger charge to the "leaves", causing them to bend inward toward one another, like a mouth closing on its prey.
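The sense-relay-actuate loop can be summarized in a few lines of control logic. The voltage levels and threshold below are illustrative assumptions, not Shahinpoor's measured values:

```python
def flytrap_step(sensor_mv, trigger_mv=5.0, actuate_v=3.0):
    """One pass of the sense/relay/actuate loop described above.

    sensor_mv  : millivolts produced by the touched IPMC sensor strip
    trigger_mv : assumed relay threshold (illustrative, not measured)
    actuate_v  : assumed drive voltage applied to the IPMC "leaves"
    Returns the voltage the relay applies to the leaves.
    """
    if sensor_mv >= trigger_mv:   # a touch generated enough charge
        return actuate_v          # drive the leaves so they bend shut
    return 0.0                    # no touch: the trap stays open
```

The key point the sketch captures is that the IPMC strips act as self-powered sensors: the relay only amplifies a signal the material itself generates when touched.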
The material Shahinpoor invented bends when given a charge due to the redistribution of ions.
Side view design of the robotic VFT in open (a) and closed (b) configurations. Image: Bioinspir. Biomim. 6 046004. doi:10.1088/1748-3182/6/4/046004
At first glance, the creation of the robot might not seem like much more than a fun little project for an engineer who likes to tinker. On closer inspection however, it becomes clear that the robot Venus Flytrap is actually a demonstration of a new kind of technology that may lead to advances in medical applications. A refined application, for example, could perhaps one day lead to human muscle replacements or help with sensory applications. One example might be restoring facial expressions to people who lose the ability due to a stroke. 
Also, because Shahinpoor's robot Flytrap has demonstrated an ability to catch flies, it's possible that such technology could also one day be used as a means for allowing robots to feed themselves, thus alleviating the need for a power supply.
More information: Biomimetic robotic Venus flytrap (Dionaea muscipula Ellis) made with ionic polymer metal composites, Mohsen Shahinpoor, 2011 Bioinspir. Biomim. 6 046004. doi:10.1088/1748-3182/6/4/046004
The work described in this paper is a novel design of a robotic Venus flytrap (VFT) (Dionaea muscipula Ellis) by means of ionic polymeric metal composite (IPMC) artificial muscles as distributed nanosensors and nanoactuators. Rapid muscular movements in carnivorous plants, such as VFT, which are triggered by antenna-like sensors (trigger hair), present a golden key to study distributed biomolecular motors. Carnivorous plants, such as VFT, possess built-in intelligence (trigger hairs), as a strategy to capture prey, that can be turned on in a controlled manner. In the case of the VFT, the prey that is lured by the sweet nectar in the VFT pair of jaw-like lobes has to flip and move the trigger hairs, which are colorless, bristle-like and pointed. The dynamically moved trigger hairs then electro-elastically send an electric signal to the internal ions in the lobe to migrate outwardly for the jaw-like lobes to close rapidly to capture the prey. The manner in which the VFT lobes bend inward to capture the prey shows a remarkable similarity with typical IPMCs bending in an electric field. Furthermore, the mechano-electrical sensing characteristics of IPMCs also show a remarkable resemblance to mechano-electrical trigger hairs on the lobes of the VFT. The reader is referred to a number of papers in connection with sensing and actuation of IPMCs in particular. Thus, one can integrate IPMC lobes with a common electrode in the middle of one end of the lobes to act like a spine and use IPMC bristles as trigger finger to sense the intrusion of a fly or insect to send a sensing signal to a solid state relay which then triggers the actuation circuit of the IPMC lobes to rapidly bend toward each other and close. The two lobes, which form the trap, are attached to the midrib common electrode which is conveniently termed the spine. The upper surface of each lobe is dished, and spaced along the free margins of the lobes with some  15–20 prong-like teeth. 
These are tough and pointed, and are inclined at an inward angle so that when the trap is sprung shut they will interlock. We have been experimenting with the VFT closing of its jaw-like lobes that close in about 0.3 s and have gained a lot of knowledge to report on the ionic and electrical mechanisms involved in the operation of such intelligent distributed biomolecular motors.

