At CES, Intel is introducing several new products and projects focused on wearables:
- Jarvis, a headset that automatically integrates with a personal assistant app, such as Siri, on a phone, with no need to touch the phone.
- A smartwatch with “geo-fencing” to monitor the wearer’s location. For example, if the wearer steps outside the geo-fence in an emergency, the watch can send out an alert.
- Wearable reference devices to accelerate wearable-device innovation, including smart earbuds that provide biometric and fitness capabilities, and a smart wireless charging bowl.
- Collaborations with Barneys New York, the Council of Fashion Designers of America and Opening Ceremony to explore and bring to market new smart wearable technologies, and to increase dialogue and cooperation between the fashion and technology industries.
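The geo-fencing idea in the smartwatch above is straightforward to sketch: compare the wearer’s distance from a fence center against a radius, and alert when it is exceeded. A minimal Python sketch, assuming a circular fence (Intel has not published implementation details; the coordinates and radius here are invented):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def outside_geofence(lat, lon, center_lat, center_lon, radius_m):
    """True if the watch position is outside the circular fence (send an alert)."""
    return haversine_m(lat, lon, center_lat, center_lon) > radius_m

# A 200 m fence around a home location (coordinates are invented)
print(outside_geofence(40.7130, -74.0065, 40.7128, -74.0060, 200))  # False: inside
print(outside_geofence(40.7200, -74.0060, 40.7128, -74.0060, 200))  # True: ~800 m away
```

A real device would add hysteresis and GPS-error filtering so that a single noisy fix near the boundary does not trigger a false alarm.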
Intel also introduced Intel Edison, a new Intel Quark technology-based computer housed in an SD card form factor with built-in wireless capabilities and support for multiple operating systems. Intel says it will “enable rapid innovation and product development by a range of inventors, entrepreneurs and consumer product designers when available this summer.”
If you’ve been thinking about getting into 3D printing, the compact MakerBot Replicator Mini 3D printer, just introduced at CES, could make it easy and affordable at $1,375 (available spring 2014).
It’s limited to printing objects around 4 x 4 x 4 inches, but the company claims it’s easy to use, with no 3D-modeling skills needed. You can download models from the free MakerBot PrintShop and Thingiverse, or from the MakerBot Digital Store.
MakerBot PrintShop, a free tablet app, lets you send the models wirelessly to the Replicator Mini printer to be built.
MakerBot is also introducing professional models for larger objects: an updated version of the MakerBot Replicator ($2,899) and the MakerBot Replicator Z18 ($6,499), which can print multiple objects at once, up to 18 inches high (spring 2014).
Meiji University professor Hiroshi Nagashima is creating chimeric pigs, animals that carry genetic material from two different species, BBC News reports. He starts by making what he calls “a-pancreatic” embryos: white pig embryos in which the gene that carries the instructions for developing the animal’s pancreas has been “switched off.”
The Japanese team then introduces stem cells from a black pig into the embryo. What they have discovered is that as the pig develops, it will be normal except for its pancreas, which will be genetically a black pig’s.
In a lab at Tokyo University, Professor Hiro Nakauchi is taking the next step. He takes skin cells from an adult brown rat, then uses gene manipulation to change these adult skin cells into induced pluripotent stem (iPS) cells, which can develop into any part of the animal’s body.
Nakauchi has succeeded in using these iPS cells to grow a brown rat pancreas inside a white mouse. He is hoping to develop a technique to take skin cells from a human adult and change them into iPS cells, which could then be injected into a pig embryo.
Island of Dr. Moreau or the end to organ shortage?
The result, he hopes, will be a pig with a human pancreas or kidney or liver, or maybe even a human heart. Not only that, the organ would be genetically identical to the human from which the skin cells were taken.
This is one of the holy grails of medical research: the ability to reproduce a human organ that is genetically identical to the person who needs it. It could mean an end to donor waiting lists, and an end to problems of organ rejection.
But there are many potential obstacles ahead. The first is that pigs and humans are only distantly related. Another is getting approval: in Japan, it is illegal to make human-animal hybrids. Animal rights activists object to the idea of pigs, sheep or goats being used as human organ factories, and many more people feel uncomfortable about the idea of pig-human hybrids, which brings to mind H.G. Wells’ sci-fi classic The Island of Dr. Moreau.
Prof Nakauchi said his research is completely different. The pigs would still be pigs; they would just be carrying some human tissue inside them. He said there has always been resistance to new scientific breakthroughs, pointing to the widespread objections to in vitro fertilization (IVF) when it was pioneered in Britain in the 1970s. Today, IVF is used across the world, and no one thinks it is strange or unethical.
Open collaboration — which has brought the world Bitcoin, TEDx and Wikipedia — is likely to lead to new organizations that are not quite non-profits and not quite corporations, according to a paper by Sheen S. Levine of Columbia University and Michael J. Prietula of Emory University published in the journal Organization Science.
The authors define open collaboration as “any system of innovation or production that relies on goal-oriented yet loosely coordinated participants who interact to create a product (or service) of economic value, which they make available to contributors and non-contributors alike.”
Open collaboration emerged with open-source software less than two decades ago. Its underlying principles are now found in many other ventures. Some of them are Internet-based; others are offline, such as TEDx, medicine, and traditional scientific experimentation.
Key points in the paper:
- Open collaboration is likely to expand into new domains, displacing traditional organizations. They suggest that executives and civic leaders should take heed.
- Open collaborations perform well even in seemingly harsh environments: when cooperators are in the minority, when “free riders” who consume without contributing are present, when diversity is lacking, or when the goods are rival (one person’s use leaves less for others).
- Such ventures have been affecting traditional firms, with Wikipedia, for example, supplanting Encyclopedia Britannica as a major general research tool. But despite their impact, the operating principles of open collaboration have been opaque. The new research explains how these organizations operate and where they are likely to succeed.
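The flavor of the paper’s computational experiments can be conveyed with a deliberately simplified sketch. Here cooperators add reusable work to a shared, nonrival pool each round (new contributions build on existing ones), while free riders only consume; this is an illustrative toy under those assumptions, not the authors’ published model:

```python
def run_open_collab(n_agents=100, coop_frac=0.3, rounds=50):
    """Toy open-collaboration dynamic: each round, every cooperator contributes
    one unit of work plus a reuse bonus proportional to the existing pool.
    Free riders consume the (nonrival) pool but add nothing. Returns the
    final pool size. Illustrative only, not the published model."""
    n_coop = int(n_agents * coop_frac)
    pool = 0.0
    for _ in range(rounds):
        pool += n_coop * (1.0 + 0.001 * pool)  # contributions compound via reuse
    return pool

# Even when cooperators are a minority, the shared pool still grows
print(f"minority-cooperator pool: {run_open_collab(coop_frac=0.2):.0f}")
print(f"majority-cooperator pool: {run_open_collab(coop_frac=0.8):.0f}")
```

The qualitative point matches the paper’s finding: because goods are nonrival and reusable, output keeps accumulating even with many free riders present.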
Abstract of Organization Science paper
The principles of open collaboration for innovation (and production), once distinctive to open source software, are now found in many other ventures. Some of these ventures are Internet based: for example, Wikipedia and online communities. Others are off-line: they are found in medicine, science, and everyday life. Such ventures have been affecting traditional firms and may represent a new organizational form. Despite the impact of such ventures, their operating principles and performance are not well understood. Here we define open collaboration (OC), the underlying set of principles, and propose that it is a robust engine for innovation and production. First, we review multiple OC ventures and identify four defining principles. In all instances, participants create goods and services of economic value, they exchange and reuse each other’s work, they labor purposefully with just loose coordination, and they permit anyone to contribute and consume. These principles distinguish OC from other organizational forms, such as firms or cooperatives. Next, we turn to performance. To understand the performance of OC, we develop a computational model, combining innovation theory with recent evidence on human cooperation. We identify and investigate three elements that affect performance: the cooperativeness of participants, the diversity of their needs, and the degree to which the goods are rival (subtractable). Through computational experiments, we find that OC performs well even in seemingly harsh environments: when cooperators are a minority, free riders are present, diversity is lacking, or goods are rival. We conclude that OC is viable and likely to expand into new domains. The findings also inform the discussion on new organizational forms, collaborative and communal.
A new system developed by researchers at five institutions, including MIT, could overcome many limitations of current methods for developing detectors that are responsive to a broad range of infrared light. Such detectors could form sensitive imaging arrays for security systems, for example.
The new system works at room temperature and provides a broad infrared response, says associate professor of mechanical engineering Tonio Buonassisi.
The system incorporates atoms of gold into the surface of silicon’s crystal lattice in a way that maintains the material’s original structure. It also has the advantage of using silicon, a common semiconductor that is relatively low-cost, easy to process, and abundant.
The approach works by implanting gold into the top hundred nanometers of silicon and then using a laser to melt the surface for a few nanoseconds. The silicon atoms recrystallize into a near-perfect lattice, and the gold atoms don’t have time to escape before getting trapped in the lattice.
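The trapped gold matters because pure silicon is essentially transparent to photons with energy below its roughly 1.12 eV band gap; the gold impurities create mid-gap states that let lower-energy infrared photons be absorbed. A quick back-of-the-envelope check, using the standard conversion E[eV] ≈ 1240 / λ[nm]:

```python
def photon_energy_ev(wavelength_nm):
    """Photon energy in eV from wavelength in nm: E = hc / lambda ≈ 1239.84 / lambda."""
    return 1239.84 / wavelength_nm

SI_BANDGAP_EV = 1.12  # crystalline silicon at room temperature

for wl in (1100, 1550, 2200):  # Si band edge, telecom band, the paper's long-wave limit
    e = photon_energy_ev(wl)
    side = "above" if e > SI_BANDGAP_EV else "below"
    print(f"{wl} nm -> {e:.2f} eV ({side} the Si band gap)")
```

A 2,200 nm photon carries only about 0.56 eV, half the band-gap energy, which is why detecting it in silicon requires the impurity states the hyperdoping provides.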
Its efficiency is probably too low for use in silicon solar cells, Buonassisi says. However, this laser processing method might be applicable to different materials that would be useful for making solar cells, he says.
The research was funded by the U.S. Army Research Office, the National Science Foundation, the U.S. Department of Energy, and the MIT-KFUPM Center for Clean Water and Energy, a joint project of MIT and the King Fahd University of Petroleum and Minerals.
Abstract of Nature Communications paper
Room-temperature infrared sub-band gap photoresponse in silicon is of interest for telecommunications, imaging and solid-state energy conversion. Attempts to induce infrared response in silicon largely centred on combining the modification of its electronic structure via controlled defect formation (for example, vacancies and dislocations) with waveguide coupling, or integration with foreign materials. Impurity-mediated sub-band gap photoresponse in silicon is an alternative to these methods but it has only been studied at low temperature. Here we demonstrate impurity-mediated room-temperature sub-band gap photoresponse in single-crystal silicon-based planar photodiodes. A rapid and repeatable laser-based hyperdoping method incorporates supersaturated gold dopant concentrations on the order of 10²⁰ cm⁻³ into a single-crystal surface layer ~150 nm thin. We demonstrate room-temperature silicon spectral response extending to wavelengths as long as 2,200 nm, with response increasing monotonically with supersaturated gold dopant concentration. This hyperdoping approach offers a possible path to tunable, broadband infrared imaging using silicon at room temperature.
In the search for cheaper materials that mimic their purer, more expensive counterparts, researchers are abandoning hunches and intuition for theoretical models and pure computing power.
In a new study, researchers from Duke University’s Pratt School of Engineering used computational methods to identify dozens of platinum-group alloys that were previously unknown to science but could prove beneficial in a wide range of applications.
Platinum is expensive, but it’s used to transform toxic fumes leaving a car’s engine into more benign gases, to produce high-octane gasoline, plastics and synthetic rubbers, and to fight the spread of cancerous tumors.
“We’re looking at the properties of ‘expensium’ and trying to develop ‘cheapium,’” said Stefano Curtarolo, director of Duke’s Center for Materials Genomics. “We’re trying to automate the discovery of new materials and use our system to go further faster.”
The research is part of the Materials Genome Initiative launched by President Barack Obama in 2011. The initiative’s goal is to support centers, groups and researchers in accelerating the pace of discovery and deployment of advanced material systems crucial to achieving global competitiveness in the 21st century.
The study appears in the Dec. 30 edition of the American Physical Society journal Physical Review X (open access) and is highlighted in a Viewpoint article in the same issue.
Databases and algorithms to screen thousands of potential materials
The identification of the new platinum-group compounds hinges on databases and algorithms that Curtarolo and his group have spent years developing. Using theories about how atoms interact to model chemical structures from the ground up, Curtarolo and his group screened thousands of potential materials for high probabilities of stability.
After nearly 40,000 calculations, the results identified 37 new binary alloys in the platinum-group metals, which include osmium, iridium, ruthenium, rhodium, platinum and palladium.
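The stability screen behind those 37 predictions can be sketched in miniature: for each candidate composition, compare its computed formation energy against the lower “convex hull” of competing phases; only compounds on the hull are thermodynamically stable. The published work uses high-throughput first-principles (DFT) energies; this one-dimensional toy uses invented numbers:

```python
def lower_convex_hull(points):
    """Lower convex hull of (composition x, formation energy) points.
    Compounds that lie on the hull are predicted stable against
    decomposition into mixtures of their neighbors."""
    pts = sorted(points)
    hull = []
    for x, e in pts:
        # pop the last hull point while it lies on or above the new segment
        while len(hull) >= 2:
            (x1, e1), (x2, e2) = hull[-2], hull[-1]
            if (x2 - x1) * (e - e1) - (x - x1) * (e2 - e1) <= 0:
                hull.pop()
            else:
                break
        hull.append((x, e))
    return hull

# Invented formation energies (eV/atom) for a hypothetical A-B binary system;
# the pure end members A (x=0) and B (x=1) define zero formation energy
candidates = [(0.0, 0.0), (0.25, -0.05), (0.5, -0.20), (0.75, 0.02), (1.0, 0.0)]
hull = lower_convex_hull(candidates)
stable = [p for p in hull if 0 < p[0] < 1 and p[1] < 0]
print(stable)  # only the x=0.5 compound survives the stability screen
```

Here the x=0.25 candidate is below zero but still unstable: a mixture of pure A and the x=0.5 compound has lower energy at that composition, which is exactly the kind of filtering the Duke database automates across thousands of systems.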
These metals are prized for their catalytic properties, resistance to chemical corrosion and performance in high-temperature environments, among other properties. Commercial applications for the group include electrical components, corrosion-resistant apparatus, fuel cells, chemotherapy and dentistry. And because of their worldwide scarcity, each metal fetches a premium price.
Now it is up to experimentalists to produce these new materials and discover their physical properties.
In addition to identifying unknown alloys, the study also provides detailed structural data on known materials. For example, there are indications that some may be structurally unstable at low temperatures. This isn’t readily apparent because creating such materials is difficult, requiring high temperatures or pressures and very long equilibration processes.
“We hope providing a list of targets will help identify new compounds much faster and more cheaply,” said Curtarolo. “Physically going through these potential combinations just to find the targets would take 200 to 300 graduate students five years. As it is, characterizing the targets we identified should keep the experimentalists busy for 20 years.”
This research was supported by the Department of Defense Office of Naval Research (ONR) and the National Science Foundation (NSF).
Abstract of Physical Review X paper
We report a comprehensive study of the binary systems of the platinum-group metals with the transition metals, using high-throughput first-principles calculations. These computations predict stability of new compounds in 28 binary systems where no compounds have been reported in the literature experimentally and a few dozen of as-yet unreported compounds in additional systems. Our calculations also identify stable structures at compound compositions that have been previously reported without detailed structural data and indicate that some experimentally reported compounds may actually be unstable at low temperatures. With these results, we construct enhanced structure maps for the binary alloys of platinum-group metals. These maps are much more complete, systematic, and predictive than those based on empirical results alone.
Self-driving vehicles offer the promise of significant benefits to society, but raise several policy challenges, including the need to update insurance liability regulations and privacy concerns such as who will control the data generated by this technology, according to a new RAND Corporation study.
“Our research finds that the social benefits of autonomous vehicles — including decreased crashes, increased mobility and increases in fuel economy — will outweigh the likely disadvantages,” said James Anderson, lead author of the study and a senior behavioral scientist at RAND, a nonprofit research organization.
The study, intended as a guide for state and federal policymakers, explores communications, regulatory challenges and liability issues raised by autonomous vehicle technology.
Several states (Nevada, Florida, California, Minnesota) as well as Washington, D.C., have already created laws to regulate the use of autonomous vehicle technology, the researchers found. Other states also have proposed legislation. Unfortunately, this could lead to a patchwork of conflicting regulatory requirements that vary from state to state, which could undermine potential benefits, Anderson said.
Reducing crashes, energy consumption, pollution, and congestion
Cars and light vehicles equipped with this technology will likely reduce crashes, energy consumption and pollution, as well as cut costs associated with congestion. According to the Insurance Institute for Highway Safety, nearly a third of all crashes could be prevented if all vehicles had forward collision and lane-departure warning systems, side-view (blind spot) assistance and adaptive headlights. And as of March 2013, Google had logged more than 500,000 miles of autonomous driving on public roads with its driverless car without incurring a crash.
Autonomous vehicles have the potential to provide increased mobility for the elderly, the disabled and the blind. The costs associated with traffic congestion could be reduced because riders could do other tasks in transit.
Fully autonomous cars also could improve land use in several ways. Currently, about 31 percent of the space in the central business districts of 41 major cities is dedicated to parking, but autonomous vehicles would be able to drop passengers off, and then drive themselves to remote, satellite parking lots. The technology also might reduce car ownership and promote ride-sharing.
However, Anderson said that many of the benefits will accrue to parties other than the technology’s purchasers. These positive societal effects may justify some form of government subsidy to encourage more consumers to use the new technology.
Issues to work through
Negative consequences include the possibility that the technology may encourage greater travel, increasing total vehicle miles traveled and leading to more congestion. If autonomous vehicle software becomes standardized, a single flaw could lead to many accidents, and Internet-connected systems might be vulnerable to malicious hacking.
Researchers say there also are a number of issues that car manufacturers and policymakers will have to work through before driverless vehicles become common. While the technology can sense and react more quickly than humans can, it is not as good at interpreting data: is that object in the road a deer, a cardboard box, or a bicycle? Weather, terrain, and roadway signage vary across the United States; will a vehicle perform as well in a snowy climate with steep hills as in a dry, flat one?
Car manufacturers also will have to deal with the issue of sensor failure, Anderson said. Designing a system that recognizes when a sensor is not transmitting any information is easier than developing one that can determine when a sensor is throwing out intermittent or nonsensical data. Developing the infrastructure to allow systems such as traffic signals to communicate with these cars also will be complex and potentially costly, and making sure the technology is secure from hackers is another concern. Finally, despite growing interest in autonomous vehicle technology, it may be too expensive for widespread adoption.
The study recommends the following policy considerations:
- Policymakers should avoid passing regulations prematurely while the technology is still evolving.
- Distracted-driving laws will need to be updated to incorporate autonomous vehicle technology.
- Policymakers should clarify who will own the data generated by this technology and how it will be used, and address privacy concerns.
- Regulations and liability rules should be designed by comparing the performance of autonomous vehicles to that of average human drivers and the long-term benefits of the technology should be incorporated into determinations of liability.
The study, “Autonomous Vehicle Technology: A Guide for Policymakers,” can be found at www.rand.org.
Research for the study was funded by the RAND Initiated Research Program, using discretionary funds made possible in part by the generosity of philanthropic donors to RAND. The program is designed to fund projects that are likely to be important in the future but for which existing outside sources of funding are limited.
Abstract of RAND Corporation study
For the past hundred years, innovation within the automotive sector has created safer, cleaner, and more affordable vehicles, but progress has been incremental. The industry now appears close to substantial change, engendered by autonomous, or “self-driving,” vehicle technologies. This technology offers the possibility of significant benefits to social welfare — saving lives; reducing crashes, congestion, fuel consumption, and pollution; increasing mobility for the disabled; and ultimately improving land use. This report is intended as a guide for state and federal policymakers on the many issues that this technology raises. After surveying the advantages and disadvantages of the technology, RAND researchers determined that the benefits of the technology likely outweigh the disadvantages. However, many of the benefits will accrue to parties other than the technology’s purchasers. These positive externalities may justify some form of subsidy. The report also explores policy issues, communications, regulation and standards, and liability issues raised by the technology; and concludes with some tentative guidance for policymakers, guided largely by the principle that the technology should be allowed and perhaps encouraged when it is superior to an average human driver.
Tel Aviv University researchers have developed a computer algorithm that predicts which genes can be “turned off” to create the same anti-aging effect as calorie restriction*. The findings, reported in Nature Communications, could lead to the development of new drugs to treat aging.
“Most algorithms try to find drug targets that kill cells to treat cancer or bacterial infections,” says Keren Yizhak, a doctoral student in Prof. Eytan Ruppin’s laboratory. “Our algorithm is the first in our field to look for drug targets not to kill cells, but to transform them from a diseased state into a healthy one.”
Ruppin’s lab is a leader in the growing field of genome-scale metabolic modeling, or GSMM. Using mathematical equations and computers, GSMMs describe the metabolic, or life-sustaining, processes of living cells. Yizhak’s algorithm, which she calls a “metabolic transformation algorithm,” or MTA, can take information about any two metabolic states and predict the environmental or genetic changes required to go from one state to the other.
“Gene expression” is the measurement of the expression level of individual genes in a cell, and genes can be “turned off” in various ways to prevent them from being expressed in the cell. In the study, Yizhak applied MTA to the genetics of aging. After using her custom-designed MTA to confirm previous laboratory findings, she used it to predict genes that can be turned off to make the gene expression of old yeast look like that of young yeast. Yeast is the most widely used genetic model organism because much of its DNA is conserved in humans.
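The core idea of transforming one cellular state into another can be illustrated with a toy version: represent each state as a vector of gene activities, and score every candidate knockout by how much zeroing that gene moves the “old” state toward the “young” one. This is an illustration of the concept only; the published MTA operates on genome-scale metabolic flux models, not raw activity vectors, and all numbers here are hypothetical:

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length activity vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def toy_mta(source, target):
    """For each gene, measure how much turning it off (zeroing its activity)
    moves the source state toward the target state. Returns (improvement,
    gene index) pairs, biggest improvement first. Illustrative toy only."""
    base = distance(source, target)
    scores = []
    for gene in range(len(source)):
        knocked = list(source)
        knocked[gene] = 0.0  # simulate switching the gene off
        scores.append((base - distance(knocked, target), gene))
    return sorted(scores, reverse=True)

# Hypothetical activity levels: the 'old' profile over-expresses genes 1 and 3
old = [1.0, 5.0, 2.0, 4.0]
young = [1.0, 0.5, 2.0, 0.2]
ranked = toy_mta(old, young)
print([gene for _, gene in ranked[:2]])  # top knockout candidates
```

Exhaustively scoring single knockouts like this is feasible at genome scale too, which is how MTA can rank thousands of genes and surface a short list for the wet lab.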
Some of the genes that the MTA identified were already known to extend the lifespan of yeast when turned off. Of the other genes she found, Yizhak sent seven to be tested at a Bar-Ilan University laboratory. Researchers there found that turning off two of the genes, GRE3 and ADH2, in actual (non-digital) yeast significantly extends the yeast’s lifespan.
“You would expect about three percent of yeast’s genes to be lifespan-extending,” said Yizhak. “So achieving a 10-fold increase over this expected frequency, as we did, is very encouraging.”
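The arithmetic behind that claim: two hits out of the seven genes tested is a roughly 29 percent hit rate, against the roughly 3 percent expected by chance:

```python
hit_rate = 2 / 7    # GRE3 and ADH2 extended lifespan, out of 7 genes tested
baseline = 0.03     # ~3% of yeast genes expected to be lifespan-extending by chance
print(f"hit rate {hit_rate:.1%}, enrichment {hit_rate / baseline:.1f}x over chance")
```

That works out to about 9.5-fold, consistent with the "10-fold increase" Yizhak cites.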
Hope for humans
Since MTA provides a systemic view of cell metabolism, it can also shed light on how the genes it identifies contribute to changes in genetic expression. In the case of GRE3 and ADH2, MTA showed that turning off the genes increased oxidative stress levels in yeast, thus possibly inducing a mild stress similar to that produced by calorie restriction.
As a final test, Yizhak applied MTA to human metabolic information. MTA was able to identify a set of genes that can transform 40 to 70 percent of the differences between the old and young information from four different studies. While currently there is no way to verify the results in humans, many of these genes are known to extend lifespan in yeast, worms, and mice.
Next, Yizhak will study whether turning off the genes predicted by MTA prolongs the lifespan of genetically engineered mice.
One day, drugs could be developed to target genes in humans, potentially allowing us to live longer. MTA could also be applied to finding drug targets for disorders where metabolism plays a role, including obesity, diabetes, neurodegenerative disorders, and cancer.
* Restricting calorie consumption is one of the few proven ways to combat aging. Though the underlying mechanism is unknown, calorie restriction has been shown to prolong lifespan in yeast, worms, flies, monkeys, and, in some studies, humans.
Abstract of Nature Communications paper
The growing availability of ‘omics’ data and high-quality in silico genome-scale metabolic models (GSMMs) provide a golden opportunity for the systematic identification of new metabolic drug targets. Extant GSMM-based methods aim at identifying drug targets that would kill the target cell, focusing on antibiotics or cancer treatments. However, normal human metabolism is altered in many diseases and the therapeutic goal is fundamentally different—to retrieve the healthy state. Here we present a generic metabolic transformation algorithm (MTA) addressing this issue. First, the prediction accuracy of MTA is comprehensively validated using data sets of known perturbations. Second, two predicted yeast lifespan-extending genes, GRE3 and ADH2, are experimentally validated, together with their associated hormetic effect. Third, we show that MTA predicts new drug targets for human ageing that are enriched with orthologs of known lifespan-extending genes and with genes downregulated following caloric restriction mimetic treatments. MTA offers a promising new approach for the identification of drug targets in metabolically related disorders.
Self-driving cars (SDCs) that retain driver controls are expected to hit highways around the globe before 2025, and self-driving-“only” cars (in which only the car drives) are anticipated around 2030, according to an emerging-technologies study on autonomous cars from IHS Automotive.
In the study, “Emerging Technologies: Autonomous Cars — Not If, But When,” IHS Automotive forecasts total worldwide sales of self-driving cars will grow from nearly 230,000 in 2025 to 11.8 million in 2035 — 7 million SDCs with both driver control and autonomous control and 4.8 million that have only autonomous control.
In all, there should be nearly 54 million self-driving cars in use globally by 2035.
The study anticipates that nearly all of the vehicles in use are likely to be self-driving cars or self-driving commercial vehicles sometime after 2050.
The price premium for the SDC electronics technology will add between $7,000 and $10,000 to a car’s sticker price in 2025, a figure that will drop to around $5,000 in 2030 and about $3,000 in 2035 when no driver controls are available.
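Those unit forecasts imply explosive growth: going from roughly 230,000 units in 2025 to 11.8 million in 2035 corresponds to a compound annual growth rate of nearly 50 percent. A quick check:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1.0 / years) - 1.0

rate = cagr(230_000, 11_800_000, 2035 - 2025)
print(f"implied growth: {rate:.1%} per year")  # roughly 48% per year
```

For comparison, few automotive product categories have ever sustained growth at that pace for a decade, which underscores how aggressive the IHS forecast is.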
Benefits of self-driving cars
“There are several benefits from self-driving cars to society, drivers and pedestrians,” says Egil Juliussen, principal analyst for infotainment and autonomous driver assisted systems at IHS Automotive. Juliussen co-authored the study with IHS Automotive senior ADAS analyst Jeremy Carlson.
“Accident rates will plunge to near zero for SDCs, although other cars will crash into SDCs; but as the market share of SDCs on the highway grows, overall accident rates will decline steadily,” Juliussen says. “Traffic congestion and air pollution per car should also decline because SDCs can be programmed to be more efficient in their driving patterns.”
The study also notes some potential barriers to SDC deployment and two major technology risks: software reliability and cyber security. The barriers include implementation of a legal framework for self-driving cars and establishment of government rules and regulations.
Autonomous cars by 2020
Several automakers have said publicly they will have autonomous cars by 2020, or earlier. Autonomous car technology is already affecting driver assist systems such as adaptive cruise control, lane keep assist, and collision mitigating brake systems.
Additionally, the IHS study says the first group of autonomous cars will have so-called Level 3 capability — limited self-driving that enables the driver to cede full control of all safety-critical functions under certain traffic and environmental conditions and includes auto pilot for highway travel and parking.
Coming later in the decade will be SDCs with Level 4 capability — self-driving but with human controls.
North America is forecast to account for 29 percent of worldwide sales of self-driving cars with human controls (level 4) and self-driving only cars (level 5) in 2035, or nearly 3.5 million vehicles. China will capture the second largest share at 24 percent, or more than 2.8 million units, while Western Europe will account for 20 percent of the total, 2.4 million vehicles.
UPDATE Jan 3, 2014
KurzweilAI asked Dr. Egil Juliussen, principal analyst for infotainment and autonomous driver assisted systems at IHS Automotive, to comment on the following questions.
How do these forecasts on self-driving cars compare to other ones?
EG: I am not aware of any other forecasts on self-driving cars. It is a little early to make such forecasts. This was done as a part of a large multi-client study called “New Urban Mobility” that looked at the major aspects of the auto industry until 2035. It became clear that self-driving cars would have a growing impact on the auto industry and I spent several months researching the technology and other elements of self-driving cars. The report that was released was extracted from this project.
Is Google in agreement with the dates? I have the impression that they are anticipating earlier dates for level 5 [fully self-driving].
EG: I don’t know if Google agrees. Hopefully they will buy the report and give us some feedback. Google is ahead of the auto OEMs today, and is likely to be a technology supplier at some point. The Level 5 or self-driving-only car (without driver control) has not been discussed much and we believe it will come 5 years later than the SDC [self-driving car] that can also be controlled by a driver. I would think that Google will be thinking about this too, but they have not said anything about this topic.
What about highway and road redesign to facilitate level 4 and 5? Is it necessary and if so, for what types of roads?
EG: We did not address this topic, but it is an interesting question. The SDC has to be designed to drive on existing road infrastructure. Over time it is possible that the road infrastructure may be modified, but that is a long time off.
Is this device above — part of a quantum-computing research project at the Laboratory for Physical Sciences in Maryland — the core of a future NSA quantum computer for cracking nearly every kind of encryption used to protect banking, medical, business and government records around the world?
According to documents provided by former NSA contractor Edward Snowden, the effort to build “a cryptologically useful quantum computer” is part of a $79.7 million research program titled “Penetrating Hard Targets,” The Washington Post revealed Thursday.
Much of the work is hosted under classified contracts at this laboratory, the newspaper says. On its website, the lab mentions its work on quantum mechanical effects in superconducting circuits at a temperature of about 10 mK (barely above absolute zero), using a superconducting charge qubit or Cooper-pair box.
According to the Post, the NSA is expected to be able to have some building blocks by the end of September, which it described in a document as “dynamical decoupling and complete quantum control on two semiconductor qubits.”
Another project, called “Owning the Net,” aims at quantum-based attacks on encryptions, the Post says.
“The application of quantum technologies to encryption algorithms threatens to dramatically impact the US government’s ability to both protect its communications and eavesdrop on the communications of foreign governments,” the Post said, based on an internal document provided by Snowden.
An entirely new approach to measuring body temperature — an “electronic skin” that adheres non-invasively to human skin, conforms well to contours, and provides a detailed temperature map of any surface of the body — has been developed by an international multidisciplinary team including researchers at the University of Illinois at Urbana-Champaign and the National Institute of Biomedical Imaging and Bioengineering (NIBIB).
Subtle variations in temperature can indicate potentially harmful underlying conditions such as constriction or dilation of blood vessels, or dehydration. Even changes in mental activity, such as increased concentration while solving a mathematical equation, are accompanied by measurable changes in body temperature.
Detecting skin temperature changes can provide early indicators of disease development and progression. For example, sophisticated infrared digital cameras can detect, in high resolution, temperature changes across large areas of the body. At the other end of the technology spectrum, paste-on temperature sensors provide simple, single-point measurements. Although both technologies are accurate, infrared cameras are expensive and require the patient to remain completely still, and while paste-on sensors allow free movement, they provide limited information.
How it works
The temperature sensor array is a variation of a novel technology, originally developed in the lab of Professor John Rogers at the University of Illinois at Urbana-Champaign, called “epidermal electronics,” consisting of ultrathin, flexible skin-like arrays that contain sensors and heating elements. The arrays resemble a tattoo of a micro-circuit board.
The technology offers the potential for a wide range of diagnostic and therapeutic capabilities with little patient discomfort. For example, sensors can be incorporated that detect different metabolites of interest. Similarly, the heaters can be used to deliver heat therapy to specific body regions; actuators can be added that deliver an electrical stimulus or even a specific drug. Future versions will have a wireless power coil and an antenna for remote data transfer.
Testing the new device
In this study, the array contained heat sensors so that it could be tested for its ability to accurately detect variations in localized skin temperature when compared to the “gold standard,” the infrared camera. The profiles of temperature changes were virtually identical with the two methods.
The investigators also performed a test that is used as a cardiovascular screening procedure. Blood flow changes are detected by changes in skin temperature as blood moves through the forearm while a blood pressure cuff on the upper arm is inflated and deflated. Once again, the infrared camera and the array technology showed virtually identical temperature change profiles. Temperature was reduced when blood flow was blocked and it increased as blood was released. Slow return of blood to the forearm can indicate potential cardiovascular abnormalities.
This experiment demonstrated that the device could potentially be used as a rapid screening tool to determine whether an individual should be further tested for disorders, such as diabetes or cardiovascular disease, that cause abnormal peripheral blood flow. It could also be a signal to doctors and patients about effects of certain medications.
The final experiment addressed a feature unique to the skin array technology: delivery of a stimulus, such as heat. The researchers sent precise pulses of heat to the skin to measure skin perspiration, which indicates a person’s overall hydration. Taken together, the test results demonstrated the ability of the array technology to obtain a range of accurate, clinically useful measurements, and deliver specific stimuli, with a single, convenient, and relatively inexpensive device.
Other potential applications
In addition to heat sensors, many other types of sensors could be included, such as ones that reveal glucose levels, blood oxygen content, blood cell counts, or levels of a circulating medication. Also, an element could be included in the circuit that delivers a medication, an essential micro-nutrient, or various stimuli to promote rapid wound healing. This ability to sense and deliver a wide range of stimuli makes the system useful for diagnostic, therapeutic and experimental purposes.
The technology has the potential to carry out such therapeutic and diagnostic functions while patients go about their daily business, with the data being delivered remotely via a cell phone to a physician — saving the expense of obtaining the same diagnostic measurements, or performing the same therapeutic stimulus, in the clinic.
Alexander Gorbach, Ph.D., one of the co-investigators from NIBIB, and head of the Infrared Imaging and Thermometry Unit, says, “We are very excited about the unique potential of this technology to vastly improve healthcare at multiple levels. Continuous monitoring outside of a hospital setting will be more convenient and cost-effective for patients. Additionally, access to data collected over extended periods, while a patient is going about a normal routine, should improve the practice of medicine by enabling physicians to adjust a treatment regimen ‘24/7’ as needed.”
The investigators are already receiving requests from other clinical research labs to use this technology, and plan to expand collaboration with academia and industry. The hope is that the research community’s interest in epidermal electronics will accelerate the development and validation of this technology and hasten its incorporation into clinical care.
Abstract of Nature Materials paper
Precision thermometry of the skin can, together with other measurements, provide clinically relevant information about cardiovascular health, cognitive state, malignancy and many other important aspects of human physiology. Here, we introduce an ultrathin, compliant skin-like sensor/actuator technology that can pliably laminate onto the epidermis to provide continuous, accurate thermal characterizations that are unavailable with other methods. Examples include non-invasive spatial mapping of skin temperature with millikelvin precision, and simultaneous quantitative assessment of tissue thermal conductivity. Such devices can also be implemented in ways that reveal the time-dynamic influence of blood flow and perfusion on these properties. Experimental and theoretical studies establish the underlying principles of operation, and define engineering guidelines for device design. Evaluation of subtle variations in skin temperature associated with mental activity, physical stimulation and vasoconstriction/dilation along with accurate determination of skin hydration through measurements of thermal conductivity represent some important operational examples.
Two teams of scientists using NASA’s Hubble Space Telescope report they have characterized the atmospheres of a pair of planets with masses intermediate between gas giants, like Jupiter, and smaller, rockier planets, like Earth.
A survey by NASA’s Kepler space telescope mission previously showed that objects in this size range are among the most common type of planets in our Milky Way galaxy. The researchers described their work as an important milestone on the road to characterizing potentially habitable, Earth-like worlds beyond the solar system.
The two planets studied are known as GJ 436b and GJ 1214b. GJ 436b is categorized as a “warm Neptune” because it is much closer to its star than frigid Neptune is to our Sun. The planet is located 36 light-years away in the constellation Leo.
GJ 1214b is known as a “super-Earth” type planet. Super-Earths are planets with masses between that of Earth and Neptune. Because no such planet exists in our solar system, the physical nature of super-Earths is largely unknown. GJ 1214b is located just 40 light-years from Earth, in the constellation Ophiuchus.
Both GJ 436b and GJ 1214b can be observed passing in front of, or transiting, their parent stars. This provides an opportunity to study these planets in more detail as starlight filters through their atmospheres.
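The transit technique lends itself to a quick numerical illustration. The sketch below uses the standard transit-depth relation (the fractional dimming is roughly the squared ratio of planet radius to star radius); it is not taken from these papers, and the sample radii are assumed, illustrative values rather than measurements from the studies.

```python
# Transit depth: the fraction of starlight blocked when a planet
# crosses the face of its star is roughly (R_planet / R_star)^2.
def transit_depth(r_planet_earth, r_star_solar):
    """Fractional dimming, with radii given in Earth and solar units."""
    R_EARTH_KM = 6371.0
    R_SUN_KM = 695_700.0
    rp = r_planet_earth * R_EARTH_KM
    rs = r_star_solar * R_SUN_KM
    return (rp / rs) ** 2

# Illustrative, assumed values: a super-Earth of ~2.7 Earth radii
# transiting a red dwarf of ~0.2 solar radii.
depth = transit_depth(2.7, 0.2)
print(f"{depth:.4%}")  # a dimming on the order of one percent
```

A dimming that large is comfortably within Hubble's photometric reach, which is why small stars like these make attractive transit targets.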
An atmospheric study of GJ 436b based on such transit observations with Hubble over the last year is presented in one of the papers, led by Heather Knutson of the California Institute of Technology in Pasadena, Calif. The news is about what they didn’t find. The Hubble spectra were featureless and revealed no chemical fingerprints whatsoever in the planet’s atmosphere.
“Either this planet has a high cloud layer obscuring the view, or it has a cloud-free atmosphere that is deficient in hydrogen, which would make it very unlike Neptune,” said Knutson. “Instead of hydrogen, it could have relatively large amounts of heavier molecules such as water vapor, carbon monoxide, and carbon dioxide, which would compress the atmosphere and make it hard for us to detect any chemical signatures.”
Evidence of high clouds
Observations similar to those obtained for GJ 436b had been previously obtained for GJ 1214b. The first spectra of this planet were also featureless and presented a similar puzzle: The planet’s atmosphere either was predominantly water vapor or hydrogen-dominated with high-altitude clouds.
A team of astronomers led by Laura Kreidberg and Jacob Bean of the University of Chicago used Hubble to obtain a deeper view of GJ 1214b that revealed what they consider definitive evidence of high clouds blanketing the planet. These clouds hide any information about the composition and behavior of the lower atmosphere and surface. The new Hubble spectra also revealed no chemical fingerprints whatsoever in the planet’s atmosphere, but the high precision of the new data enabled them to rule out cloud-free compositions of water vapor, methane, nitrogen, carbon monoxide, or carbon dioxide for the first time.
“Both planets are telling us something about the diversity of planet types that occur outside of our own solar system; in this case we are discovering that we may not know them as well as we thought,” said Knutson. “We’d really like to determine the size at which these planets transition from looking like mini-gas giants to something more like a water world or a rocky, scaled-up version of the Earth. Both of these observations are fundamentally trying to answer that question.”
Models of GJ 436b and GJ 1214b predict that, at the scorching temperatures of several hundred degrees Fahrenheit expected in these atmospheres, clouds could be made of potassium chloride or zinc sulfide. “You would expect very different kinds of clouds to form on these planets than you would find, say, on Earth,” said Kreidberg.
The Chicago team had to make a big effort to conclusively determine the nature of GJ 1214b’s cloudy atmosphere. Kreidberg explained, “We really pushed the limits of what is possible with Hubble to make this measurement — our work devoted more Hubble time to a single exoplanet than ever before. This advance lays the foundation for characterizing other Earths with similar techniques.
“Looking forward, the James Webb Space Telescope will be transformative. The new capabilities of this telescope will allow us to peer through the clouds on GJ 1214b and similar exoplanets.”
Abstract of Nature paper (Laura Kreidberg et al.)
Recent surveys have revealed that planets intermediate in size between Earth and Neptune (‘super-Earths’) are among the most common planets in the Galaxy. Atmospheric studies are the next step towards developing a comprehensive understanding of this new class of object. Much effort has been focused on using transmission spectroscopy to characterize the atmosphere of the super-Earth archetype GJ 1214b, but previous observations did not have sufficient precision to distinguish between two interpretations for the atmosphere. The planet’s atmosphere could be dominated by relatively heavy molecules, such as water (for example, a 100 per cent water vapour composition), or it could contain high-altitude clouds that obscure its lower layers. Here we report a measurement of the transmission spectrum of GJ 1214b at near-infrared wavelengths that definitively resolves this ambiguity. The data, obtained with the Hubble Space Telescope, are sufficiently precise to detect absorption features from a high mean-molecular-mass atmosphere. The observed spectrum, however, is featureless. We rule out cloud-free atmospheric models with compositions dominated by water, methane, carbon monoxide, nitrogen or carbon dioxide at greater than 5σ confidence. The planet’s atmosphere must contain clouds to be consistent with the data.
Abstract of Nature paper (Heather A. Knutson et al.)
GJ 436b is a warm — approximately 800 kelvin — exoplanet that periodically eclipses its low-mass (half the mass of the Sun) host star, and is one of the few Neptune-mass planets that is amenable to detailed characterization. Previous observations have indicated that its atmosphere has a ratio of methane to carbon monoxide that is 10⁵ times smaller than predicted by models for hydrogen-dominated atmospheres at these temperatures. A recent study proposed that this unusual chemistry could be explained if the planet’s atmosphere is significantly enhanced in elements heavier than hydrogen and helium. Here we report observations of GJ 436b’s atmosphere obtained during transit. The data indicate that the planet’s transmission spectrum is featureless, ruling out cloud-free, hydrogen-dominated atmosphere models with an extremely high significance of 48σ. The measured spectrum is consistent with either a layer of high cloud located at a pressure level of approximately one millibar or with a relatively hydrogen-poor (three per cent hydrogen and helium mass fraction) atmospheric composition.
Rice University researchers have developed a noninvasive technology that accurately detects even a single malaria-infected cell among a million normal cells through the skin in seconds with a laser scanner.
The “vapor nanobubble” technology requires no dyes or diagnostic chemicals, there is no need to draw blood, and there are zero false-positive readings.
The diagnosis and screening will be supported by a low-cost, battery-powered portable device that can be operated by non-medical personnel. One device should be able to screen up to 200,000 people per year, with the cost of diagnosis estimated to be below 50 cents, the researchers say.
The new diagnostic technology uses a low-powered laser that creates tiny vapor “nanobubbles” inside malaria-infected cells. The bursting bubbles have a unique acoustic signature that allows for an extremely sensitive diagnosis.
The transdermal diagnostic method takes advantage of the optical properties and nanosize of hemozoin, a nanoparticle produced by the malaria parasite inside red blood cells. Hemozoin crystals are not found in normal red blood cells.
Lead investigator Dmitri Lapotko, a Rice faculty fellow in biochemistry and cell biology and in physics and astronomy who invented the vapor nanobubble technology, and lead co-author Ekaterina Lukianova-Hleb found that hemozoin absorbs the energy from a short laser pulse and creates a transient vapor nanobubble.
This short-lived vapor nanobubble emerges around the hemozoin nanoparticle and is detected both acoustically and optically. In the study, the researchers found that acoustic detection of nanobubbles made it possible to detect malaria with extraordinary sensitivity.
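The detection step can be pictured with a toy example. The sketch below is illustrative only (it is not the Rice team's actual signal processing); it simply flags a transient spike that rises well above the baseline noise of a recorded acoustic trace, which is the essence of threshold-based detection of a nanobubble collapse.

```python
# Illustrative sketch only, not the Rice team's actual algorithm.
# A nanobubble collapse appears as a brief spike in the acoustic trace;
# this detector flags any sample far above the baseline noise level.
def detect_nanobubble(trace, n_sigma=5.0):
    """Return True if the trace contains a spike > n_sigma above baseline."""
    n = len(trace)
    mean = sum(trace) / n
    var = sum((x - mean) ** 2 for x in trace) / n
    std = var ** 0.5 or 1e-12          # guard against a perfectly flat trace
    return any((x - mean) / std > n_sigma for x in trace)
```

On a noise-only trace the detector stays quiet; adding a single large spike trips it, mirroring the claim that one infected cell's signal stands out against a clean background.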
“Ours is the first through-the-skin method that’s been shown to rapidly and accurately detect malaria in seconds without the use of blood sampling or reagents,” said Lapotko.
Lapotko said the first trials of the technology in humans are expected to begin in Houston in early 2014.
Malaria, one of the world’s deadliest diseases, sickens more than 300 million people and kills more than 600,000 each year, most of them young children. Despite widespread global efforts, malaria parasites have become more resistant to drugs, and efficient epidemiological screening and early diagnosis are largely unavailable in the countries most affected by the disease.
Inexpensive rapid diagnostic tests exist, but they lack sensitivity and reliability. The gold standard for diagnosing malaria is a “blood smear” test, which requires a sample of the patient’s blood, a trained laboratory technician, chemical reagents, and a high-quality microscope. These are often unavailable in low-resource hospitals and clinics in the developing world.
Abstract of Proceedings of the National Academy of Sciences paper
Successful diagnosis, screening, and elimination of malaria critically depend on rapid and sensitive detection of this dangerous infection, preferably transdermally and without sophisticated reagents or blood drawing. Such diagnostic methods are not currently available. Here we show that the high optical absorbance and nanosize of endogenous heme nanoparticles called “hemozoin,” a unique component of all blood-stage malaria parasites, generates a transient vapor nanobubble around hemozoin in response to a short and safe near-infrared picosecond laser pulse. The acoustic signals of these malaria-specific nanobubbles provided transdermal noninvasive and rapid detection of a malaria infection as low as 0.00034% in animals without using any reagents or drawing blood. These on-demand transient events have no analogs among current malaria markers and probes, can detect and screen malaria in seconds, and can be realized as a compact, easy-to-use, inexpensive, and safe field technology.
New research published online first in the Jan. 1 Journal of the American Medical Association suggests that alpha tocopherol (fat-soluble vitamin E, an antioxidant) may slow functional decline — problems with daily activities such as shopping, preparing meals, planning, and traveling — in patients with mild-to-moderate Alzheimer’s disease and decrease caregiver burden.
Vitamin E did not delay cognitive or memory deterioration in the research.
“Since the cholinesterase inhibitors [galantamine, donepezil, rivastigmine], we have had very little to offer patients with mild-to-moderate dementia,” said Mary Sano, PhD, trial co-investigator, and professor in the department of psychiatry, Icahn School of Medicine at Mount Sinai, and director of research at the James J. Peters Veteran’s Administration Medical Center, Bronx, New York. “This trial showed that vitamin E delays progression of functional decline by 19% per year, which translates into 6.2 months benefit over placebo.”
The clinical trial investigators believe vitamin E can be recommended as a treatment strategy, based on the double-blind randomized controlled trial.
The Veterans Affairs Cooperative Randomized Trial of Vitamin E and Memantine in Alzheimer’s Disease (TEAM-AD) examined the effects of 2,000 IU/d of vitamin E, 20 mg/d of memantine, the combination, or placebo on the Alzheimer’s Disease Cooperative Study/Activities of Daily Living (ADCS-ADL) Inventory score. A group of 613 patients with mild-to-moderate Alzheimer’s disease were in the study, which was launched in August 2007 and finished in September 2012 at 14 Veterans Affairs Medical Centers.
Difficulty with activities of daily living often affects patients with Alzheimer’s disease, which is estimated to afflict as many as 5.1 million Americans.
The OpenWorm Project — an open-source project dedicated to creating a virtual C. elegans nematode in a computer by reverse-engineering its biology — has now developed software that replicates the worm’s muscle movement.
The ultimate scientific goal of OpenWorm: understanding how the worm brain works via a full digital simulation.
This is a video of a prototype of the C. elegans swimming-locomotion simulation.
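To give a flavor of what such a simulation computes, here is a minimal, hypothetical sketch (not OpenWorm's actual code) of the traveling sinusoidal bending wave that drives nematode swimming; all parameter names and values here are assumptions chosen for illustration.

```python
import math

# Minimal sketch (not OpenWorm code): a swimming nematode's body can be
# approximated as a traveling sinusoidal bending wave that propagates
# from head to tail, pushing against the fluid to drive the worm forward.
def body_curvature(segment, t, n_segments=48, wavelength=1.5, freq=2.0, amp=1.0):
    """Curvature of one body segment at time t (arbitrary units)."""
    x = segment / n_segments                      # 0 (head) to ~1 (tail)
    phase = 2 * math.pi * (x / wavelength - freq * t)
    return amp * math.sin(phase)

# Sample the bending wave along the body at one instant:
wave = [body_curvature(s, t=0.0) for s in range(48)]
```

A real simulation like OpenWorm's goes far beyond this, modeling individual muscle cells, neurons, and the surrounding fluid, but the emergent motion is this kind of propagating body wave.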
Tech predictions for businesses in 2014: mobility, wearables, intelligent assistants, gestural computing, facial recognition
J. P. Gownder, vice president and principal analyst at Forrester Research serving Infrastructure & Operations professionals, offers these predictions for 2014 for businesses:
- Mobility: Customers will actively shun businesses that lack mobile applications to enrich their experiences.
- Wearables: Wearables will come to the enterprise, often in customer-facing situations. Google Glass could be the next big app platform.
- Intelligent assistants: Intelligent agents like Siri and Watson will start to look more useful and interesting, and become easier to use. They’ll help people shop, manage calendars, and surprise users by mining personal data. They’ll start to reshape the way we compute altogether.
- Gestural computing: finally hitting the big time, with new applications, including manipulating and navigating medical imaging.
- Facial recognition in stores: You’ll walk into a store and it “knows you” and customizes your shopping, as Tesco is doing with facial recognition software that allows it to serve up appropriate ad content in its stores. But will shoppers approve?
In Guangdong Province in southern China, ten transgenic piglets have been born this year; under a black light, they glow a greenish tint.
A technique developed by reproductive scientists from the University of Hawai‘i at Mānoa’s John A. Burns School of Medicine was used to quadruple the success rate at which plasmids carrying a jellyfish gene for a fluorescent protein were transferred into pig embryos.
The green color is a marker that indicates that the fluorescent genetic material injected into the pig embryos has been incorporated into the animal’s natural make-up.
The ultimate goal is to introduce beneficial genes into larger animals to create less costly and more efficient medicines.
The IBR technique involves proprietary pmgenie-3 plasmids conferring active integration during cytoplasmic injection. This technique was also used to produce the world’s first “glowing green rabbits” in Turkey earlier this year. Turkey is expected to announce results of similar research involving sheep in the New Year.
In the video below, the pigs — not unlike human children afraid of the dark — begin to squeal when the lights are turned off, except for the black light, which illuminates the green tint. The noise is because the scientists are holding the by-now-large piglets in a container to restrict their movement and make the fluorescent glow most visible.
Medtronic, Inc. has announced the first-in-human implant of the world’s smallest pacemaker: the Micra Transcatheter Pacing System (TPS).
The device was implanted in a patient in Linz, Austria as part of the Medtronic global pivotal clinical trial. The Micra TPS is an investigational device worldwide.
At one-tenth the size of a conventional pacemaker, and comparable in size to a large vitamin, the Micra TPS is delivered directly into the heart through a catheter inserted in the femoral vein. Once positioned, the pacemaker is securely attached to the heart wall and can be repositioned if needed.
No leads, no incision, minimally invasive
Unlike current devices, the new miniature device does not require the use of wires, known as “leads,” to connect to the heart. Attached to the heart via small tines, the pacemaker delivers electrical impulses that pace the heart through an electrode at the end of the device.
“Because of its small size and unique design, the Micra TPS can be introduced directly into the heart via a minimally invasive procedure, without the need for leads,” said Clemens Steinwender, M.D., head of cardiology at the Linz General Hospital in Linz, Austria. “The combination of this novel technology with a transcatheter procedure can benefit patients by potentially reducing pocket or lead complications and recovery times observed with traditional surgical pacemaker implants.”
In contrast to current pacemaker implant procedures, the Micra TPS implant also does not require a surgical incision in the chest and the creation of a “pocket” under the skin. This eliminates a potential source of device-related complications, and any visible sign of the device.
Micra TPS Study Design
The study is a single-arm, multi-center global clinical trial that will enroll up to 780 patients at approximately 50 centers. Initial results from the first 60 patients, followed up to three months, are expected in the second half of 2014.
H/T: Michael Weiner
RIKEN has announced plans to develop a new exascale supercomputer, meaning it will compute at least one quintillion (a million trillion) floating point operations per second — 30 times faster than the current fastest supercomputer, China’s Tianhe-2.
The new supercomputer is scheduled to begin working in 2020.
Funded by the Ministry of Education, Culture, Sports, Science and Technology of Japan, it is expected to “keep Japan at the leading edge of computing science and technology,” RIKEN said in a statement.
The new system will be about 100 times faster than the RIKEN-developed K computer, which was the world’s fastest supercomputer in 2011. RIKEN was selected for the new project based on its experience developing and managing the K computer, the research institution says. The RIKEN Advanced Institute for Computational Science (AICS) will continue to operate and manage the K computer.
The Tianhe-2, developed by China’s National University of Defense Technology, has an Rmax performance of 33.86 petaflop/s, according to the Top500 list of supercomputers.
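The headline speedups are easy to sanity-check. In the back-of-envelope sketch below, the Tianhe-2 figure comes from the article; the K computer's Rmax of roughly 10.5 petaflop/s is supplied here as an assumption.

```python
# Back-of-envelope check of the speedup figures in the article.
EXAFLOP = 1e18                  # one quintillion FLOP/s (a million trillion)
TIANHE_2 = 33.86e15             # Tianhe-2 Rmax from the article, in FLOP/s
K_COMPUTER = 10.51e15           # K computer Rmax (assumed here, ~10.5 Pflop/s)

print(f"vs. Tianhe-2:   {EXAFLOP / TIANHE_2:.0f}x")    # ~30x
print(f"vs. K computer: {EXAFLOP / K_COMPUTER:.0f}x")  # ~95x, i.e. 'about 100 times'
```

The ratios line up with the article's claims: an exaflop machine would be roughly 30 times Tianhe-2 and on the order of 100 times the K computer.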
Exascale supercomputing is expected to make possible high-resolution simulations, contributing to advances in a wide range of areas including drug discovery, weather forecasting, and astrophysics.