Hungary has become a leader in the European tech scene, ranking among the top 5 tech economies for VC investment in Europe.

With tech innovation now one of the country’s top ten industries, Hungarian firms are looking not only to spread their influence throughout Europe but to reach overseas markets as well.

One of these companies is the Budapest-based startup accelerator Traction Tribe, whose innovative hybrid approach and reputation for careful mentorship have distinguished its accelerator program on the European scene.

“Hungary is quietly building one of the stronger tech economies in Europe,” said Antal Karolyi, Traction Tribe partner and head of growth. “There’s a long history of tech and science focus here.”

The next step for Hungary, says Karolyi, is to expand globally. Traction Tribe seeks to do so by providing European startups with access to, and firsthand preparation for, business opportunities in the U.S. tech market. Startups go through a three- to six-month accelerator mentorship, with early seed funding of up to $100,000, training from experts with on-the-ground knowledge of the U.S. investment economy, and the potential for further funding after completion.

“This isn’t about exporting our talent,” says Karolyi, “so much as exposing Hungary to the world and the world to Hungary.”

With all of the mobile apps, Web sites and free services clamoring for your personal data, whereabouts and preferences, it might seem as though privacy is at death’s door. Not so. Several projects underway aim to provide virtual lockboxes or screening technologies that can help people to reassert control over their digital lives.
 
In general, privacy-enhancing approaches to online data sharing require any company, app developer or government agency that wants to know more about you to ask permission for access to specific information. The rest of your data stay locked away.
 
Services such as Britain’s Mydex offer the ability to store, manage and share personal information in an encrypted central repository called a data store, which only the person who creates the store can fully access. Anyone wanting information contained within that personal data store—an insurance company or marketer, for example—must connect to Mydex’s network and agree to terms of use created by the person owning the data before Mydex will release it. Personal, based in Washington, D.C., offers a similar “data vault” service.
 
A system called the open personal data store (openPDS) platform, under development at the Massachusetts Institute of Technology, likewise consolidates information into a single location that can be stored on one’s computer or with a service provider (aka in the “cloud”). OpenPDS, however, deals specifically with metadata—which can describe a person’s location, phone use or Web searches, for example. The M.I.T. approach protects privacy by refusing to share any of that data directly. Instead, a mobile app, Web site or research firm looking for information protected by an openPDS must query the data store directly—to check, for instance, whether your shipping address has changed or to confirm your present location. OpenPDS responds specifically to that query with answers that the openPDS owner approves for release, according to a study published July 9 in PLOS ONE.
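
The query-answering design is easiest to see in code form. Below is a minimal Python sketch of the pattern the paper describes; the class, method and query names are invented for illustration and are not the actual openPDS API.

```python
# Minimal sketch of the query-answering idea behind openPDS.
# Class, method and query names are invented for illustration;
# this is not the actual openPDS API.

class PersonalDataStore:
    def __init__(self, metadata, approved_queries):
        self._metadata = metadata          # raw metadata never leaves this object
        self._approved = approved_queries  # narrow questions the owner has approved

    def ask(self, query, *args):
        """Answer an approved question without exposing raw records."""
        if query not in self._approved:
            raise PermissionError(f"owner has not approved query: {query!r}")
        return self._approved[query](self._metadata, *args)


store = PersonalDataStore(
    metadata={"shipping_address": "221B Baker St", "last_tower": "cell-42"},
    approved_queries={
        # Returns True/False; the address itself is never released.
        "address_changed_since": lambda m, known: m["shipping_address"] != known,
    },
)

print(store.ask("address_changed_since", "742 Evergreen Terrace"))  # True

try:
    store.ask("current_location")  # not on the approved list -> refused
except PermissionError as err:
    print(err)
```

The point of the design is that answers, not records, cross the boundary: a marketer learns whether the address changed, never the address itself.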
 
Simply anonymizing records by stripping out names and other identifying information is not enough to protect one’s privacy, says Yves-Alexandre de Montjoye, an M.I.T. graduate student in media arts and sciences and first author on the new paper. De Montjoye and colleagues at M.I.T. and the Catholic University of Louvain in Belgium have demonstrated in past experiments that as few as four data points from a person’s mobile phone checking in with nearby cell towers are enough to identify the owner of that phone 95 percent of the time. The researchers are now testing openPDS with telecommunications companies in Italy and Denmark.
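
A toy simulation conveys why so few points can suffice: when each observation rules out most candidates, four observations usually leave only one match. The sketch below uses uniformly random synthetic traces, not the researchers’ data or methods, so it overstates the effect relative to real, more predictable human movement.

```python
# Toy illustration of the re-identification result: how often do four
# (hour, cell-tower) points from one phone's trace match exactly one
# user? Synthetic, uniformly random traces only -- not the study's data.
import random

random.seed(1)
N_USERS, N_TOWERS, N_HOURS = 10_000, 200, 24 * 7

# Each user's trace: the tower their phone checked in with each hour.
traces = [[random.randrange(N_TOWERS) for _ in range(N_HOURS)]
          for _ in range(N_USERS)]

def unique_with_k_points(target, k=4):
    hours = random.sample(range(N_HOURS), k)
    points = [(h, traces[target][h]) for h in hours]
    matches = sum(all(t[h] == tower for h, tower in points) for t in traces)
    return matches == 1  # only the target fits all k points

trials = [unique_with_k_points(random.randrange(N_USERS)) for _ in range(200)]
print(f"uniquely identified in {sum(trials) / len(trials):.0%} of trials")
```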
 
A service in the Netherlands called the Qiy Trust Framework offers a slightly different model for privacy protection. Since 2012 people in that country have been able to create their own online portal for organizing and protecting the personal information they give to utility companies, government agencies and businesses. Unlike Mydex or openPDS, Qiy is not a data repository—any data that someone gives to, say, a wireless provider stays in the database of that entity. If such organizations participate in the Qiy program, individuals access their accounts with those agencies by logging into their own Qiy account. They gain protection because participating organizations are required to adhere to guidelines established by the nonprofit Qiy Foundation, including mandatory data encryption.
 
Such data centralization projects are likely to be received lukewarmly in the U.S., at least at the moment. Here, two trends conspire against them: stories about data stolen from supposedly secure corporate or government databases have become commonplace; and many consumers show little reluctance to hand over access to their phones’ address books and GPS trackers in exchange for free mobile apps.
 
Still, as the controversy over the U.S. National Security Agency’s data collection practices deepens and data breaches proliferate, greater demand for improved privacy tools is likely to emerge eventually. The New York State Attorney General’s office issued a report on July 15, for example, saying that in 2013 alone 7.3 million records of New Yorkers were exposed in more than 900 data security breaches, thanks in part to the “retail mega-breaches” at Target and LivingSocial. Five of the 10 largest data thefts reported to that office have occurred since 2011.
 
The N.Y. Attorney General’s office recommends that, to protect themselves and their clients, companies and other organizations should cut down the amount of data they themselves collect and store. This advice is exactly the kind that could pave the way for new types of privacy protection technologies.

Twenty-one dead lab chickens piled up this spring at a government facility before its researchers could pinpoint why. The team had requested and received what was meant to be a relatively harmless strain of avian flu. Instead, the virus killed all the test birds during experiments. The samples, it turns out, were contaminated with the deadly H5N1 flu strain. The mishap raised concerns about safety procedures at the lab that provided the virus—which belonged to the Centers for Disease Control and Prevention. The facility that received the virus was operated by the U.S. Department of Agriculture. Even more concerning: the dead chickens are just one incident in a string of worrisome recent events at government labs that work with potentially deadly microbes. Just yesterday federal officials revealed that 327 vials holding an array of decades-old pathogens were discovered in the same unsecured storage room where six vials of smallpox were found earlier this month on the National Institutes of Health campus in Bethesda, Md.

No one died or even got sick as a result of the H5N1 and other mistakes, but “that does not change the fact that these were unacceptable events,” CDC Director Tom Frieden told Congress at a July 16 hearing. The microbes involved in other CDC incidents include anthrax, botulism and brucellosis. The CDC’s influenza and anthrax labs are shuttered at least for now, and the agency has promised a far-reaching check of its 22 biosafety level 3 (BSL-3) and BSL-4 laboratories before any future samples are shipped in or out. Such facilities handle microbes that cause serious or lethal disease and require special safeguards to ensure the agents do not escape or sicken employees. The biosafety levels increase with the sophistication of the security measures taken and the relative danger of the pathogens studied. (Read: Bio-Unsafety Level 3: Could the Next Lab Accident Result in a Pandemic?)

The larger question for infectious disease labs and officials now is how these events will alter high-level biosafety work at the approximately 1,500 top-level U.S.-funded and private labs across the country. On July 11, when Frieden revealed the extent of recent incidents in a new report and public remarks, he said that his agency wants to reduce the number of laboratories that work with dangerous agents, the number of people who have access to those labs and the number of dangerous pathogens studied. There is no agreement yet on just how many labs would be affected.

In fact, there is no record of how many high-level BSL-designated labs even exist, Rep. Henry Waxman (D–Calif.) said at the July 16 hearing of the House Energy and Commerce Subcommittee on Oversight and Investigations. He was referring to BSL-3 and 4 levels, which deal with biological substances that can cause serious or fatal illness when inhaled.
 
Around the country the number of BSL-3 labs has increased rapidly in the past decade to nearly 1,500 in the U.S., according to 2013 estimates from the Government Accountability Office (pdf). CDC itself says it has 21 BSL-3 labs. (The agency also has one BSL-4 laboratory.) Other such facilities are located at university, state, local and private labs. “You have to walk a fine line between having too many labs that puts the likelihood of having accidents high versus too few that would mean research would go at a slow pace,” says Amesh Adalja, an infectious diseases doctor and biosecurity expert at the University of Pittsburgh Medical Center. “We are working on very hard problems with these pathogens and not everything will come from one scientific center. There is an advantage to having a variety of centers, and that’s the nature of research.”

Whatever the total count of biosafety labs is, a better framework for their oversight is needed. “There is a continued lack of national standards for designing, constructing, commissioning and overseeing” these labs, testified Nancy Kingsbury, managing director of applied research and methods at the Government Accountability Office. One entity should be charged with coming up with such a plan, she said. The process to obtain the higher BSL designations, much like a security background check, is onerous. If a laboratory loses its BSL approval, it must reapply to regain it. Yet, no single agency has oversight over all these “high-containment” biolabs and there are no national standards for operation.

Meanwhile, some labs are taking matters into their own hands. One state lab in New York, where suspicious materials can be sent for analysis in the event of a bioterror attack, is considering instituting a voluntary moratorium of its own in response to last week’s news. Jill Taylor, director of the Wadsworth Center at the New York State Department of Health, says that her facility, which includes BSL-2 and BSL-3 labs, normally receives requests to send biological materials to universities, the CDC and other facilities. So, while CDC sorts out its next steps and federal labs do some soul-searching, perhaps her lab will, too, she says. “I think we will probably put a moratorium on our shipping of things,” Taylor says. “I want to see if the federal government makes changes to the regulations for shipping of reagents.” The facility would still send requested materials in the event of an emergency. The center’s research arm, however, which works with pathogens including tuberculosis, will not be hampered by the CDC moratorium, Taylor says, because it can grow its own samples in the lab.
 
The federal incidents will have an impact on other labs in terms of lessons learned. “It’s important to do a freezer review once a year, at least,” says Scott Becker, executive director of the Association of Public Health Laboratories, with a nod toward the recent discovery of long-forgotten smallpox vials at the National Institutes of Health complex. “I think it’s important to review your protocols and procedures for your techniques regularly and have a discussion inside your institution to just ask the question: Could this happen here?”
 
Back in Washington, D.C., House lawmakers at the subcommittee hearing blasted the CDC for its recent incidents. Rep. Tim Murphy (R–Pa.) said the anthrax incident raises “very serious questions” about CDC’s ability to protect the public. A culture of complacency puts the health of the American public at risk, he said. “It’s sloppy and inexcusable.” Colorado Rep. Diana DeGette (D) echoed the sentiments. The incidents reveal that “there’s a fundamental problem with the culture of identifying and reporting safety problems up the chain of command,” she said.
 
Lawmakers asked Frieden if the issues were a result of researchers becoming complacent about working with such dangerous substances. “You get inured to that danger” when working with such materials for years, Frieden said, but he was not yet ready to attribute the pattern of problems entirely to that reality.
 
An inspection of the CDC’s recent anthrax incident by the Agriculture Department’s Animal and Plant Health Inspection Service raised further questions about the lapse. The service’s report found that some exposed staff were not examined until five days after notification of the exposure and that anthrax was left in an unlocked refrigerator with the key in the lock. Moreover, workers freely passed through the area. Inspectors had to track down missing tubes and plates of anthrax. Some of the Ziploc bags used to transfer materials were also deemed insufficiently “durable.” The CDC moratorium, which hinges on a thorough review of its high-level laboratories, will continue “as long as it takes,” said Frieden, declining to offer a timeline. “This is not a small thing,” he said. Laboratories with vital public health roles, such as those working with Ebola or drug-resistant tuberculosis, “will reopen quickly,” he added.
 
The extent to which new restrictions will be placed on high-level laboratories is still an open question. “What I worry about is there are important research questions at some of these labs that won’t get answered if it becomes too difficult to do research on some of these questions,” says Adalja, the University of Pittsburgh expert. “We don’t have antivirals or vaccines for SARS (severe acute respiratory syndrome). We don’t have a vaccine or effective treatment for Ebola.”

Of all the things to be leaking methane on Staten Island in New York City—corroded gas pipes, sewers, the Fresh Kills dump—who would have suspected the mail truck? But as I circled a Staten Island neighborhood in a specially equipped Google car, it was a parked mail truck that proved to be sending the biggest leak of methane skyward.

This specially outfitted Subaru has methane detection equipment threaded through its front grill and connected to a spectrometry machine in the trunk for near real-time analysis of incoming air samples. Other than that, it’s just another one of an undisclosed number of cars used by Google to take photos along streets that can be seen on its maps in Street View mode. I was tagging along with a driver who was using it to demonstrate the new methane-detecting partnership between Google and the Environmental Defense Fund. Methane is a potent greenhouse gas that, over decades, traps at least eight times more heat than carbon dioxide, driving global warming even faster.

Steve Hamburg, a forest ecologist turned chief scientist at the environmental group, found natural gas-powered vehicles like that mail truck to be the biggest surprise of the partnership’s test runs. Such vehicles can confound the detection of leaks from underground pipes. During the test drives “we saw a level going down the street day after day,” Hamburg recalled before we headed out on our methane-detecting expedition to New York City’s fifth borough on a muggy summer day. “That was a bus,” he added—a bus that runs on natural gas and leaks some of it, like New York City’s “Clean-Air” buses powered by compressed natural gas.

But New York City also has abundant methane leaks from old pipelines, as the test runs during this pilot phase in Staten Island proved. The point of this partnership between Google Earth Outreach and EDF is to test whether a better map of methane leaks could be produced by the Street View fleet, recognizable by the unmistakable Google Maps paint job and the towering pan-optic camera affixed to each car’s roof. The idea is to add another tool for utilities to use in determining which repairs to undertake first, given limited budgets, by delivering an estimate of exactly how much gas is escaping from a given leak.

The test runs to date involved three Street View cars driving methane routes in three locations: Boston, Indianapolis and Staten Island. The results of those drives—roughly 15 million individual readings—have been released in the form of maps that show thousands of leaks. Boston and Staten Island both averaged one leak for every mile driven, thanks to older infrastructure. On the other hand, the Street View car drove roughly 200 miles for every leak detected in Indianapolis, where pipe upgrades have paid off. Even better, Indianapolis had no major leaks; Boston and Staten Island both had several red dots on their maps, indicating places where methane was leaking at a rate of more than 60,000 liters per day.

Although such relatively small methane emissions are not an immediate safety concern, they have an outsized impact on climate change. Methane is less prevalent in the atmosphere than CO2 but traps more heat during its relatively short atmospheric lifetime of a few decades; it then breaks down into yet more CO2, which goes on trapping heat for centuries or millennia. “We want to minimize losses,” Hamburg said. “We’re losing product, increasing climate change, and increasing air pollution.”

Historically, data like these were known only to utilities, or came to light when a leak grew significant enough to pose a safety danger and could be detected by sensitive equipment such as the human nose. “This is the democratization of environmental data,” Hamburg added. Taken as a whole, the data are not exactly surprising. Indianapolis, which replaced its natural gas pipelines over the last several decades, has few leaks compared with a city like Boston with older infrastructure. More than 40 percent of Boston’s natural gas pipes are cast iron or uncoated steel, which corrode more easily, and more than half of the city’s pipes are 50 years old or more.

Leaks are most common in areas that predate the widespread use of natural gas. For example, my neighborhood in Brooklyn—Gowanus—has become a Superfund site thanks in part to facilities there in the 19th century that turned coal into so-called town gas. The town gas was used to fuel gas-fired lamps, but then that infrastructure was, in some cases, taken over to deliver natural gas to homes. “These pipes were put in the ground for a different type of gas, a different moisture content,” Hamburg explained. “It’s not surprising they need to be replaced by modern plastic pipes.”

Staten Island was chosen because of the challenge of detecting leaks in the presence of other sources of methane, like the Fresh Kills landfill, where feasting microbes turn decomposing garbage into methane. The normal atmosphere of Brooklyn and Staten Island seemed to hover around 2 parts per million (ppm) of methane, except when crossing the Verrazano Bridge over New York Harbor, where readings dropped even lower. Driving down the highway that cuts through Fresh Kills, methane readings spiked as high as 4.6 ppm because of the methane seeping out of the landfill. Yet driving the perimeter of the methane recovery plant on the outskirts of those sprawling midden mounds, readings never went above the background level of around 2 ppm. “I’m impressed,” Hamburg said.

Landfills, sewers and even cattle, when the wind is right, can cause spikes that can obscure leaks. Such spikes usually registered around 10 ppm but drivers saw spikes as high as 30 or even 50 ppm. To compensate for such other sources—including more and more ubiquitous natural gas-fueled vehicles thanks to the glut of cheap gas—the maps err on the side of being conservative. “When in doubt, don’t put it down,” Hamburg said. “We’d rather have a false negative than a false positive” so that there is no unnecessary expense in undertaking repairs. And many of these findings from drives in 2013 are likely already out of date as utilities constantly repair and maintain their natural gas infrastructure.
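
That conservative rule reduces to a simple filter: flag a location only if a spike recurs on every pass. A rough sketch of the idea, with invented thresholds and toy readings rather than the project’s actual processing pipeline:

```python
# Rough sketch of the conservative rule described above: a methane
# spike counts as a pipeline leak only if it recurs at the same spot
# on repeat passes. Thresholds and readings are invented.
SPIKE_PPM = 2.5  # illustrative threshold above the ~2 ppm urban background

def spikes(readings):
    """Road positions where methane exceeds the spike threshold."""
    return {pos for pos, ppm in readings if ppm >= SPIKE_PPM}

def likely_leaks(*passes):
    """Keep only spikes seen on every pass: a parked mail truck shows
    up once, a leaking pipe shows up every time."""
    confirmed = spikes(passes[0])
    for later_pass in passes[1:]:
        confirmed &= spikes(later_pass)
    return confirmed

pass1 = [(101, 2.1), (102, 4.7), (103, 3.0)]  # 4.7 ppm: the mail truck
pass2 = [(101, 2.2), (102, 2.1), (103, 3.1)]
pass3 = [(101, 2.0), (102, 2.2), (103, 3.2)]  # position 103 recurs
print(likely_leaks(pass1, pass2, pass3))      # {103}
```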

Driving slowly through city streets and neighborhoods, the Google Street View car is always an object of curiosity. On our test run, people stopped to gawk, snapped photos or waved. We traveled just about as slowly as the car from a local driving school, presumably without a professional driver behind the wheel. Though the cameras were turned off during this proof-of-concept phase, the idea is to one day use the Street View fleet—Google declined to specify how many cars are in that fleet or allow their driver to be quoted—to map city streets and methane emissions at the same time. “This is the first time using Street View cars for an environmental project,” said Karin Tuxen-Bettman, program manager for Google Earth Outreach who helped lead this partnership. “Environmental air quality affects everyone and we like those big problems, to see how Google can play a part in solving those.”

The same equipment can also be used to tackle other forms of air pollution, including the soot responsible for asthma and other lung ailments. “Methane is just the first one,” Tuxen-Bettman said, a point reaffirmed by EDF’s Hamburg.

This inaugural effort is part of EDF’s campaign, launched in 2012, to better understand the benefits and dangers of natural gas. The campaign involves 90 institutions and corporations as well as more than 100 scientists, and includes research on the best ways to measure methane concentrations in the atmosphere and to detect leaks in natural gas infrastructure, from the original well to the final user. The preliminary research suggests that methane leaks could be cut at an additional cost of roughly one cent per thousand cubic feet of natural gas produced and moved.

On my drive, the mail truck parked by the side of a suburban road in Staten Island delivers the largest spike I see—4.7 ppm, just surpassing New York City’s largest landfill from the highway. That spike disappears when the mail truck moves on. “That’s why it’s so important to drive roads at least twice,” Hamburg noted. “So even do a third pass to see if it’s infrastructure or a vehicle.”

The trick now is to drive all that road, sniffing out the worst leaks at a clip of 130 to 240 kilometers per day, mostly in circles. Thousands of kilometers of cast iron pipes are underneath the roads of New Jersey alone, just across the water from here. And all those roads are just waiting for someone to drive them, sniffing out leaking methane, delivering cleaner air and combating climate change. Or as Hamburg asked me: “When was the last time you did a joy ride in New York City?”

There is a mantra in the fund-raising world: big donors like to support big ideas. And ideas do not come much larger than at CERN, Europe’s particle-physics laboratory near Geneva in Switzerland. Now the organization — which uses its particle smasher to probe the fundamental structure of the Universe — has registered a charitable foundation to raise funds for its educational, technology-transfer and arts activities.

CERN is not the only big institution to go after donations to fund projects that fall outside the core research remit. The trend is on the rise among large European research organizations. The European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, is shifting its fund-raising focus from industry sponsorship to private donations. And ITER, the international nuclear-fusion experiment being built in Cadarache, France, is devising a way to deal with the offers of donations that it already receives. What nobody yet knows is the fruit these efforts will bear — whether individuals really want to donate heftily to scientific charities that are not focused on medical solutions.

For CERN, there is no better time to form a charitable foundation, says Matteo Castoldi, head of its development office. CERN’s Large Hadron Collider, and its discovery of the Higgs boson, have “captured the public imagination” as much as the Apollo missions did in the 1960s, he says. The organization is already taking advantage of this, “but there is much more we could do, and that’s where the foundation comes in”.

Registered in Switzerland last month, the CERN & Society foundation is designed to put CERN’s fund-raising efforts on a firmer footing: although the lab has accepted donations in the past, charitable status means that donors can now pledge tax-deductible gifts. Organizers hope that this will encourage more — and larger — donations.

CERN director-general Rolf-Dieter Heuer stresses that such funding will not replace the institute’s core budget, paid for by member states. Instead, the proceeds are aimed at activities that this funding cannot stretch to: school projects, the development of medical spin-offs such as proton therapy (the use of proton beams to kill cancer cells), and meeting the huge demand for general-interest and science-related visits. But if a donor has an explicit desire for their gift to go towards research, CERN would consider this, adds Heuer.

Continental Europe has been slow to embrace professional fund-raising and a wider culture of philanthropy: it lags behind the United Kingdom by about 20 years and the United States by 50 years, says Johannes Ruzicka, managing director of the fund-raising consultancy Brakeley in Munich, Germany, which is advising EMBL. More research institutions are thinking about this kind of fund-raising, he adds, but few actually make the leap, owing to the significant investment and administrative hassle that goes with setting up a foundation.

European universities have been bolder than research institutions, and have been trying to emulate the fund-raising abilities of their US counterparts for some time, says Kate Hunter, executive director of Europe’s arm of the Council for Advancement and Support of Education, based in London. “There’s been a massive trend over the last decade to reinforce that universities are charitable entities in their own right, and that they are a legitimate cause to support,” she says. “So I do think it’s an interesting development if pure science research institutions see that they can do that too.”

There are reasons for the reticence of institutions. Some facilities that are funded by several countries fear that raising large amounts of money through philanthropy could encourage governments to cut their contributions, says Ruzicka. But others believe that state-funding agreements have a finite lifespan and so need to be backed up by other funding sources, he adds.

ITER’s goal — to build an experimental fusion reactor that will serve as a stepping stone towards harnessing effectively limitless energy — already makes it an attractive candidate for philanthropists. The facility is creating its charity framework directly in response to people asking to contribute, says an ITER spokeswoman. The cash will go towards educational activities, internships, exhibitions and conference travel costs, although ITER is not yet authorized to accept tax-deductible donations.

CERN has no history of professional fund-raising, and Castoldi acknowledges that progress will be slow. It also remains to be seen to what extent particle physics will appeal to philanthropists. Hunter is optimistic. “Places like CERN and other research institutions are doing amazing things that will ultimately deliver public benefit, so if those organizations can make that case, that can be quite attractive for donors,” she says.

Heuer says that CERN is “completely open” to offers of any size, and Castoldi hopes that it will raise 25 million Swiss francs ($28 million) in the next five years. Individuals, trusts and companies can donate, and contributors will be recognized in various ways.

Those who make substantial gifts could even have a facility at CERN named after them, says Heuer — but not, he adds, any particles the laboratory might discover. “That is science,” he says. “We don’t touch that.”

This article is reproduced with permission and was first published on July 15, 2014.

A woman peers through goggles embedded in a large black helmet. Forest sounds emanate from various corners of the room: a bird chirping here, a breeze whispering there. She moves slowly around the room. On the wall, a flat digital forest is projected so observers can get a rough idea of her surroundings, but in her mind’s eye, this undergrad is no longer pacing a small, cramped room in a university lab. Thanks to that black helmet, she’s walking through the woods.

In a minute, she’s handed a joystick that looks and vibrates like a chainsaw, and she’s asked to cut down a tree. As she completes the task, she feels the same sort of resistance she might feel if she were cutting down a real tree. When she leaves this forest, and re-enters the “real” world, her paper consumption will drop by 20 percent and she will show a measurable preference for recycled paper products. Those effects will continue into the next few weeks and researchers hypothesize it will be a fairly permanent shift. By comparison, students who watch a video about deforestation or read an article on the subject will show heightened awareness of paper waste through that day — but they will return to their baseline behavior by the end of the week.

The tree-cutting study is one of many that Stanford University has conducted in its Virtual Human Interaction Lab over the last several years in an attempt to figure out the extent to which a simulated experience can affect behavior. And it’s part of a growing body of research that suggests virtual experiences may offer a powerful catalyst for otherwise apathetic groups to begin caring about issues and taking action, including on climate change. That’s important because while time spent in nature has been proven to be quite beneficial to human health, whether or not humans repay the favor tends to rely on the type of nature experiences they have in their youth. In a 2009 study published in the journal PLoS ONE, researchers from the University of Pretoria in South Africa found that while people who spent time hiking and backpacking were more willing to support conservation efforts a decade or more later, those who had visited national parks or spent time fishing as kids were actually less inclined to do anything to support the environment. An earlier (2006) study on the relationship between nature experiences and environmentalism found that while those who had spent their youth in “wild” nature, defined as hiking or playing in the woods, were more likely to be environmentalists as adults, those who had been exposed to “domesticated” nature—defined as visits to parks, picking flowers, planting seeds, or tending to gardens—were not. Given the unlikelihood of every child having a “wild” nature experience, researchers are on the hunt for other ways to cultivate environmentally responsible behavior.

The latest work with virtual reality builds upon roughly half a century of behavioral studies that indicate humans’ willingness to shift behavior is directly correlated to our sense of control.

Climate change, like many large-scale environmental issues, is a problem over which few people feel they have a direct impact—for better or worse. As researchers Sun Joo (Grace) Ahn and Jeremy Bailenson wrote in a forthcoming paper in the journal Computers in Human Behavior, individual actions taken at a micro scale, like failing to recycle paper or support certain policies, can contribute over time to negative environmental consequences, like deforestation, which in turn affects climate trends over many years. But the long time frames and vast scale create a dangerous disconnect. While 97 percent of peer-reviewed scientific research points to human activities as a primary contributor to climate change, only half of Americans see the link.

Proponents of virtual reality think it could help drive home the impacts of climate change and make people feel empowered to do something about it. “When individuals feel that their behaviors directly influence the well-being of the environment, they are more likely to be concerned about and actively care for the environment,” Ahn and Bailenson wrote.

Bailenson, a cognitive psychologist and founding director of Stanford’s Virtual Human Interaction Lab, sees particular value in virtual reality related to climate change because it allows for a combination of real experience with boundless possibilities: The brain treats the virtual experience as real but, at the same time, knows that anything is possible in the simulation.

“One can viscerally experience disparate futures and get firsthand experience about the consequences of human behavior,” Bailenson said.

Teacher Tech
Researchers working on both virtual and augmented reality—in which mobile apps on smartphones or tablets overlay information on reality—are increasingly experimenting with these technologies as learning tools. Multiple universities, including Stanford, Harvard, and MIT, are piloting the use of augmented and virtual reality in middle and high schools. And museums, which enjoy more flexibility because they operate outside the realm of curriculum requirements and test scores, have wholeheartedly embraced the idea. Science museums and zoos on both coasts are using the technology in exhibits and deploying augmented reality apps that visitors can use on their phones or on museum-issued mobile devices to learn more about what they’re seeing.

“Understanding complicated issues like climate change requires a shift in perspective in terms of how you’re willing to see the problem,” said Amy Kamarainen, co-director of Harvard’s EcoMOBILE and EcoMUVE projects. “We’re trying to do that by immersing kids in environments that have elements similar to real-world systems but are somewhat simplified to meet kids where they are. We put them in complex worlds but give them the tools to be able to unpack what’s happening.”

EcoMUVE, a multi-user, desktop computer-based virtual environment that features a simulated pond ecosystem, was developed by Harvard University to teach students basic biological processes like photosynthesis and decomposition as well as systems thinking about complex environmental issues. The Harvard team recently launched EcoMOBILE, a corresponding augmented reality app, which enables students to take the EcoMUVE experience with them, collect data out in the field, and “see” what’s going on below the surface and what happened in an ecosystem in the past. EcoMUVE was initially piloted in schools in Massachusetts and New York, but is now available for download by any school, and is being used across the United States and in other countries as well, including India and Mexico. EcoMOBILE is currently being piloted at schools in Massachusetts and New York.

A handful of Massachusetts high schools have also piloted an MIT-developed augmented reality app called Time Lapse 2100, which requires users to set various policies that would affect the environment and then shows them what would happen if those policies were enacted. This fall, Bay Area schools will be pilot-testing Stanford’s Coral Reef, a virtual reality game in which participants become a piece of coral in a reef affected by ocean acidification. All three universities are also working with museums and science learning centers to deploy their technology in learning experiences.

“I was initially not sold on the idea of augmented reality,” said cognitive scientist Tina Grotzer, a professor in Harvard’s graduate school of education and the co-principal investigator for both the EcoMUVE and EcoMOBILE projects. Grotzer spent several years as a teacher herself before heading to Harvard to research how kids learn, particularly how they learn science. Grotzer said it was the technology’s potential to drive home environmental science lessons that won her over. “With physics, you can do an experiment, and kids can see instantly what you’re talking about. With environmental science, we tried to do a decomposition experiment, but you set the experiment up and then 12 weeks later something happens. By then the kids have completely lost interest.”

That’s because it’s difficult for kids to grasp anything that they cannot immediately see, Grotzer explained. Augmented reality enables teachers to extend that vision, or what scholars call an attentional frame, and make the unseen more tangible. For example, teachers take kids to a nearby pond and use EcoMOBILE to show them how the town dumped garbage there 60 years ago and nearly filled in what is today a pristine, natural pond. The app shows them how plants around the pond are turning sunlight into energy and reveals what microscopic pond life is doing under the water’s surface. It also walks them through the real-world collection of water samples, which it helps them to analyze.

“I’ve tagged along on these field trips and have seen how the technology actually immerses them more in the surroundings, rather than distracting them,” Grotzer said. Students use smartphones to take photographs and notes, documenting what they’re seeing: the clarity of the pond water, the weather, descriptions of their samples, different species of bugs and birds. And they can learn at their own pace too. “On a regular field trip, if a student had a question they’d have to leave that moment that spurred the question and go ask the teacher,” Grotzer said. “The teacher would be facilitating the needs of 30 kids. This way they can find the answer themselves and stay in the moment, stay engaged with what they’re looking at.”

In Stanford’s Coral Reef, students embody a tall piece of purple coral off the coast of Italy, near Ischia. Over the course of a 14-minute lesson, they are taken through the experience of being coral in a body of water affected by ocean acidification. At first, the surrounding ocean is filled with an abundance of sea life. Waves around the reef are simulated by floor vibrations and ocean sounds. A lab technician periodically touches the participant with a stick in synchronized motions to coincide with what the participant sees as a fishing net hitting the reef. Then acidification sets in. Sea life begins to die off all around. The reef begins to lose its color, as does the piece of coral the participant has embodied.

Bailenson and his team have tested the simulation with college students and shown that it resulted in students caring more about what is happening to coral reefs. The team followed those participants over weeks, compared them with a group that had simply watched a video about how ocean acidification affects coral reefs, and found the change in attitude catalyzed by the virtual reality experience lasted longer than any shifts stirred by the video.

Smartphones for All
Whether schools opt for an augmented reality tablet app that leads students around the schoolyard pointing out, say, the biological processes at work in the compost pile; a landscape-based smartphone app (like EcoMOBILE or Time Lapse 2100) for use on a field trip; or a desktop experience (like EcoMUVE) that can be used in the school’s computer lab, they face a steep tab for both hardware and software. Hardware for virtual reality simulations remains cost prohibitive for most schools, although costs are coming down: virtual reality headsets like the Oculus Rift now cost consumers $350. A school could potentially purchase a few headsets for a multiuser virtual reality game that four students could play at a time while the rest of the class engages with an augmented reality component on desktops nearby.

Still, despite an increasing variety of options and declining prices, schools looking to put these technologies to use in the classroom face a number of challenges.

If virtual and augmented reality are to have a measurable impact on how future generations understand and approach climate change, access across all socioeconomic classes will be key. Kamarainen said that in some higher-income school districts students could use their own devices.

In many school districts around the country, however, the majority of students do not have smartphones. Mobile phone company Kajeet has begun to address this issue by offering schools data packages that provide WiFi with school-managed filtering so they can set time limits for usage, enabling kids to take home school-provided tablets for only school-related work.

In the schools where Kamarainen works, Harvard provides smartphones to students for use on field trips and pays for Kajeet’s WiFi and data service (two to three cents per megabyte per device). The Harvard apps work on both smartphones and tablets, so it’s feasible that any of the thousands of U.S. schools that have either purchased or been awarded tablets over the past two years could sign up with Kajeet to enable the use of these apps on and off campus. Industry analysts estimate that U.S. schools will purchase an additional 3.5 million tablets by the end of 2014, and multiple companies, including Intel, AT&T, Fox, and Qualcomm have launched nonprofit initiatives to dole out tablets in schools.

The Principal’s Office
Even if companies like Kajeet succeed in making hardware more affordable for schools, virtual and augmented reality developers still face a long road to see their programs widely adopted in education. Logistical challenges include securing funding for pilot tests, budgeting funds to purchase new technology, training staff, and winning buy-in from parents, teachers, and administrators.

“There are clashes all the time between the reality of what goes on in a classroom and what researchers would like to see happen in a classroom,” said Paul Olson, an outreach specialist at the Games Learning Society, or GLS, at the University of Wisconsin at Madison, who taught seventh grade for more than three decades. He said that a lot of his time these days is spent explaining to researchers what life is like “in the trenches” and encouraging teachers to experiment with GLS games to motivate those students who “really don’t respond to a lecture or a chapter in a book but are all over programming something.”

This is where museums incorporating these technologies might fill some gaps. “A museum has the freedom to step outside the rigid guidelines and requirements that schools are held to,” said Dan Wempa, vice president of external affairs for the New York Hall of Science in Queens, which sees roughly 1,200 students per day on field trips during the school year. The museum’s latest exhibit, Connected Worlds, created with input from Kamarainen, will immerse visitors in a digital, interactive world that shows how their actions affect the environment. In one part of the exhibit, visitors add water to the environment and a plant flourishes. In another, they add too much and cause flooding. Taken together, these scenarios put nature into fast forward to help students see how their individual and communal actions hurt or sustain plant and animal life, clean water, and fresh air.

“Students have a germ of knowing that water is important, but they say ‘I didn’t realize that it’s THAT important, and I didn’t realize that what I do over here affects someone way over there,’” Wempa said.

The PTA
“I’m not keen on my kids being immersed in this type of technology,” said Megy Karydes, a marketing consultant and mother of two (ages 7 and 9) in Chicago. “We very much limit our kids’ electronics exposure because I don’t want them addicted. On the other hand, I realize they need to be aware of what’s going on in the world too. I balance it, but if I had to err on the side of caution, I’d rather we go hiking than have them staring at a screen.”

Karydes’ concerns are common among parents. “There are two ways that parents tend to look at these games,” said Eric Klopfer, who directs MIT’s Scheller Teacher Education Program, developed Time Lapse 2100, and has been researching the use of augmented reality in education since 2009. “One is, ‘Great. My kid is outside, but he still has the phone in his hand,’ and the other is that the mobile device and the game are actually getting their kid outside.”

Kamarainen and Grotzer have also heard parental concerns about technology interrupting kids’ experience of nature, and they have worked hard to design games that they feel complement a relationship with nature rather than detract from it.

The EcoMOBILE pilot has included around 1,000 students so far, and Kamarainen said they consistently talk about how the augmented reality piece helps them to see things going on in their communities that they never paid attention to before. “They say this helps open their eyes about the environment that’s around them,” Kamarainen said. “They’re more aware and conscious of it, and they’re paying closer attention to the natural world.”

Ultimately, proponents say that these games not only complement and improve students’ relationship with nature but also teach them how to think systematically and to see their own roles in harming or improving their world.

“The younger kids say, ‘I get to create a world!’” Wempa said, “and the older kids say, ‘I like this because it felt like I was in control and, as a kid, I’m never in control of anything.’ That carries over. They understand that actions have consequences and that they can affect outcomes.”

This article was produced by Climate Confidential and released for re-use under a Creative Commons Attribution 4.0 International License.

Americans have differing perceptions on which weather events are being triggered by climate change, according to a new study that looked at people’s Google searches over a roughly nine-year period.

Using Google Trends—a public tool that shows how often a specific term is entered into the search engine—researcher Corey Lang found that people from all walks of life scour the Internet for information about climate change and global warming when the weather fluctuates. The timing of their searches, however, is dictated by political affiliations and education levels.

According to the study—published in the journal Climatic Change—Republicans and people living in less-educated areas looked for information about the weather, climate change and global warming during extreme cold or hot spells, while Democrats and those living in well-educated communities searched for those terms when there were changes in average temperatures.

“A very rosy interpretation of this is that while these groups may see different elements of climate change or experience weather differently, there is some trigger that is making them seek more information,” said Lang, an associate professor in the Department of Environmental and Natural Resource Economics at the University of Rhode Island.

Connecting the dots in cyberspace
After collecting information from Google Trends for 205 cities across the United States, Lang analyzed how often and when citizens in each city used the search terms “climate change,” “global warming,” “weather,” “drought” and “flood.”

He then matched monthly statistics for the period January 2004 to May 2013 with local weather station data and the 2008 presidential election results from Dave Leip’s “Atlas of Presidential Elections.” From there, he was able to determine that search activity rose with extreme summer heat, long periods of time with no rainfall and winters with minimal cold spells.
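
In spirit, that matching exercise is a join of two monthly time series followed by a look at their co-movement. Here is a minimal sketch of that kind of analysis using the pandas library; the column names and numbers are invented, not Lang’s data or code:

```python
# Minimal sketch of the kind of join behind the study: monthly search
# volume lined up against local temperature anomalies. Column names
# and numbers are invented; this is not Lang's dataset or code.
import pandas as pd

searches = pd.DataFrame({
    "month": pd.period_range("2012-06", periods=4, freq="M"),
    "search_volume": [40, 55, 48, 70],        # e.g. "climate change"
})
weather = pd.DataFrame({
    "month": pd.period_range("2012-06", periods=4, freq="M"),
    "temp_anomaly_f": [1.2, 4.8, 2.0, 6.5],   # departure from normal
})

panel = searches.merge(weather, on="month")
print(panel["search_volume"].corr(panel["temp_anomaly_f"]))
```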

While the aforementioned weather changes are consistent with projected climate change, Lang said he was “surprised” to find out that searches also increased when the weather was not indicative of global warming—such as decreases in average winter and spring temperatures.

“The results could suggest that people link weather anomalies of any kind with climate change or perhaps may involve the engagement of deniers, who experience an unusually cool winter and go online to confirm their skeptical views,” he wrote in the study.

A very telling ‘tool’
Lang’s study, however, is not the first in which researchers have turned to Google Trends to analyze society’s perception of climate change.

Earlier this year, researchers William Anderegg from Princeton University and Gregory Goldsmith from Oxford University used Google Trends to determine that the public’s declining interest in climate change was not due to several high-profile climate science controversies (ClimateWire, May 22).

And in May, researchers from the Yale Project on Climate Change Communication and the George Mason University Center for Climate Change Communication analyzed Google searches and found that while the terms “climate change” and “global warming” are often used interchangeably, the latter is both more familiar to and more effective with the American public (ClimateWire, May 28).

“Google Trends is an external tool that is really great to see what’s capturing the world’s attention,” said Roya Soleimani, a communications manager at Google. “We’re sharing the information and letting folks or experts discern and analyze the data as they see fit.”

“One of the key motivations for doing this study was to find out if people are engaging in this issue,” Lang said. “And the results say ‘yes.'”

Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. www.eenews.net, 202-628-6500

Early detection is key to slowing outbreaks of Ebola, such as the one currently spreading across West Africa that is estimated to have infected almost 1,000 people, according to the latest World Health Organization report. A molecular computer could one day simplify analysis of biomedical assays like those used to diagnose Ebola, researchers say. And a new prototype device can display a fluorescent letter in the presence of nucleic acid sequences from the Ebola virus or the closely related Marburg virus: ‘E’ for Ebola or ‘M’ for Marburg.

One way to identify microbes is with a microarray containing strands of nucleic acids complementary to DNA or RNA in different viruses. For these tests, doctors isolate and amplify viral nucleic acid in samples from an infected patient. Nucleic acids in the purified sample bind to those on the array, producing a signal – typically from a fluorescent molecule. A pattern of fluorescent spots appears on the array, and a computer then interprets the pattern to identify the virus in the sample.
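
That final, pattern-recognition step is conventionally a lookup: software compares the lit spots against each virus’s known signature. A toy sketch of the idea, with invented spot patterns rather than any real assay’s layout:

```python
# Toy sketch of the pattern-recognition step: match the array's lit
# fluorescent spots against known virus signatures. The spot patterns
# here are invented for illustration.
SIGNATURES = {
    "Ebola":   {1, 4, 7},   # spots expected to light up for each virus
    "Marburg": {1, 4, 9},
    "Lassa":   {2, 5, 8},
}

def identify(lit_spots):
    """Return the viruses whose full signature lit up on the array."""
    return [virus for virus, sig in SIGNATURES.items() if sig <= lit_spots]

print(identify({1, 3, 4, 7}))  # ['Ebola']
```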

Molecular logic
Assays using molecular computers, however, could simplify biomedical diagnostics. This approach combines the molecular recognition that occurs on the surface of a microarray and the pattern recognition traditionally done by an electronic computer into a single step. Molecular computers are programmed with sets of DNA, RNA or protein logic gates that interact with multiple molecular inputs to generate one output.

One type of logic gate uses deoxyribozymes, or DNAzymes, to convert a DNA input into a fluorescently labeled DNA output. When an input nucleic acid binds to a single-stranded closed loop on the DNAzyme, it triggers one end of the loop to separate from the stem of the DNAzyme. A strand of DNA, called the substrate, binds to the stem, and the DNAzyme snips the substrate. One product of that cleavage is a short strand of DNA containing a fluorescent dye. The dye now lights up because it is separated from a quencher on the other end of the substrate.

Researchers can design the DNAzyme structure so that inputs deactivate the gate, while other DNAzyme logic gates require combinations of inputs. Scientists link different types of molecular logic gates into circuits that perform calculations based on molecular inputs.

In 2006, Joanne Macdonald, then at Columbia University in the US, and her colleagues created a molecular computer that uses 32 DNA molecules and 128 DNAzyme logic gates to calculate the next move in a game of tic-tac-toe against a human opponent. Another circuit of 12 DNA-based logic gates calculates the square root of a four-bit binary number using eight inputs. And molecular computers can also work inside cells. One circuit senses the levels of five different microRNAs, and if the levels match those typically found in human cervical cancer cells, the circuit produces a protein that kills the cell.

Legible display
Now, Macdonald, at University of the Sunshine Coast in Australia, and her colleagues wondered if it was possible to link DNAzyme logic gates into circuits that generated a visible display of numbers or letters.

Her team designed logic gates to respond to 15-nucleotide segments of DNA from the genomes of two filoviruses, Ebola and Marburg. The presence of the sequence unique to the Marburg virus triggered one DNAzyme gate to cleave a substrate strand containing a green fluorescent dye.

The four strains of the Ebola virus, however, are so varied that they do not share a 15-nucleotide sequence. That meant that detecting Ebola required a slightly more complicated DNAzyme gate. The researchers engineered a DNAzyme with two loops: one to detect the Marburg sequence, and another to detect a sequence common to all filoviruses, which include Marburg and all strains of Ebola. This gate is activated only when the filovirus sequence is present but the Marburg sequence is not; activation cleaves a substrate strand labelled with a pink fluorescent dye.
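
In Boolean terms, the two gates compute YES(Marburg) and filovirus AND NOT Marburg. A sketch of that decision logic follows; the 15-nucleotide probe sequences shown are placeholders, not the team’s actual probes:

```python
# The two-gate readout described above, reduced to Boolean logic.
# The 15-mer probe sequences are placeholders, not the real ones.
MARBURG_PROBE = "AAGTCGTAACAAGGT"    # hypothetical Marburg-specific 15-mer
FILOVIRUS_PROBE = "GGCTACCTTGTTACG"  # hypothetical pan-filovirus 15-mer

def readout(sample_dna):
    marburg = MARBURG_PROBE in sample_dna
    filovirus = FILOVIRUS_PROBE in sample_dna
    if marburg:                      # YES gate: green fluorescence
        return "M"
    if filovirus and not marburg:    # ANDNOT gate: pink fluorescence
        return "E"
    return ""                        # no filovirus: nothing lights up

ebola_sample = "TTTT" + FILOVIRUS_PROBE + "TTTT"
marburg_sample = "TTTT" + FILOVIRUS_PROBE + MARBURG_PROBE + "TTTT"
print(readout(ebola_sample), readout(marburg_sample))  # E M
```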

Next, the researchers used a circuit design program to arrange the two logic gates in each of 15 wells of a 384-well plate. They wanted the arrangement to generate a green ‘M’ in response to the Marburg input. Adding the filovirus sequence without the Marburg sequence would generate a pink ‘E’ in the wells. After incubating the wells overnight with either input sequence, the researchers visualised the corresponding letter output using a UV light box.

Macdonald says her team is now testing the DNAzyme gates with samples of viral genomes instead of synthesised sequences.

In another experiment, the researchers built a molecular computer that generated a seven-segment display like those that create the numbers on digital clocks. They encoded the numbers one through nine using various combinations of four DNA inputs. Then they created a molecular circuit so that appropriate segments of the display lit up when the inputs for a particular number were added to the wells.
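
In software terms, each segment of such a display is a small Boolean function of the four inputs. A compact sketch using the standard seven-segment encoding; the mapping from DNA strands to bits is the invented part:

```python
# Sketch of the seven-segment idea: four binary inputs (standing in
# for the four DNA strands) select which wells light up. The segment
# encoding is the standard one; the DNA-to-bit mapping is invented.
SEGMENTS = {  # digit -> segments a..g that light up
    1: "bc",    2: "abdeg",   3: "abcdg",
    4: "bcfg",  5: "acdfg",   6: "acdefg",
    7: "abc",   8: "abcdefg", 9: "abcdfg",
}

def display(bits):
    """bits: a 4-tuple such as (0, 1, 1, 1), read as a binary number."""
    digit = int("".join(map(str, bits)), 2)
    return SEGMENTS.get(digit, "")

print(display((0, 1, 1, 1)))  # 'abc' -- the three segments that form a 7
```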

This proof-of-principle work hints at the possibility of doing away with computers to read diagnostic assays in the future, says Milan Stojanovic of Columbia University. He was Macdonald’s postdoctoral advisor from 2004–2012, but was not involved with the current work. He’s excited to see a clear readout for a process that would otherwise require decoding a complex pattern on a microarray, and he thinks the idea of simple alphanumeric display could be applied to other types of molecular logic as well.

This article is reproduced with permission from Chemistry World. The article was first published on July 18, 2014.

Paleontologist Michael Habib studies the biomechanics of pterosaurs, the biggest of which—at 550 pounds and with a 34-foot wingspan—were the size of modern-day fighter jets. They were the largest flying animals ever to exist and sported anatomy different from any bird or bat. This makes them a unique model for flight mechanics, particularly for large aircraft.

To model how pterosaurs flew, Habib combines principles of physics and vertebrate anatomy with fossil data. He hopes that this knowledge will suggest new aircraft designs and other technology to places like NASA and the DOD—it already has in some cases. In an abstract sense, he has brought these animals back from the dead. Pterosaur-inspired applications follow.

Flying Robots over Mars
Traditional aircraft would need to fly extremely fast to stay aloft in Mars’s thin atmosphere, an impracticality if scientists want to survey Martian terrain in detail. One solution may be a robot that flies like a pterosaur—with swiftly beating wings and a relatively slow-moving body. Hummingbirds and bumblebees also fly this way, and NASA has created designs for robots based on the biomechanics of these “flapping fliers.”

Morphing Wings
In each wing, pterosaurs had a single tapered finger that grew up to 2.5 meters long in the largest species. When pterosaurs flew, those fingers bent with the force of the downward wing stroke and then reflexively snapped back into position on the upward stroke. The spontaneous return to equilibrium saved pterosaurs significant energy when flapping. Habib says roboticists in the U.S. Air Force are interested in morphing wings, which they could use in flight systems in aircraft or in parachutes—essentially highly convex wings.

Rapid-Launch Systems
Unlike planes today, giant pterosaurs did not need runways. They were experts at vertical takeoff, a feat that is impossible or incredibly inefficient for today’s aircraft. Because the reptiles had stiff but lightweight, hollow bones, they could use all four limbs—both their feet and wings—to push powerfully against the ground. That action allowed them to generate more speed over a shorter distance as they leaped into flight. Habib is currently negotiating a Defense Advanced Research Projects Agency grant proposal with the DOD to design an aircraft system with analogous physical characteristics and a quadrupedal launch strategy that would allow pilots to perform a quick vertical launch or takeoff on low fuel.

Low-Flutter Tents
To fly, pterosaurs kept their wings uniformly taut. Those wings were membranous, with long, thick fibers crisscrossed by smaller fibers that controlled how much the wings fluttered. The fibers individually moved under high air pressure, but their varied dimensions meant they oscillated at opposing frequencies that ultimately canceled out, enabling pterosaurs to maintain a steady wing. Habib has approached manufacturers with a tent fabric design that exploits the same physical principle to reduce noisy flapping and improve stability in high wind conditions.

Divining whether your business is getting a decent return on investment (ROI) from its digital marketing is now easier thanks to start-up Keyword Aspects.

Keyword Aspects positions itself in stark contrast to the widespread lack of transparency and dubious ROI promises of many online marketing firms, as chronicled in news outlets such as the Wall Street Journal, says co-founder Matt Cramer, a longtime IT specialist and web developer with years of experience at companies like Match.com.

Whereas many companies charge exorbitant fees for results that are at times questionable and usually difficult or impossible to verify, Keyword Aspects is “affordable for any business budget and promises only the information companies need to make intelligent decisions for themselves,” says Cramer.

The founders say that with their tools you can see how advertising is working for you with easy-to-understand data reports that track the success of your marketing decisions over time. With Bing and Yahoo capabilities in the works as well, Keyword Aspects is the tool small businesses and large enterprises alike need to take control of their marketing.