Three Advances Make Magnetic Tape More Than a Memory

In the age of flash memory and DNA-based data storage, magnetic tape sounds like an anachronism. But the workhorse storage technology is racing along. Scientists at IBM Research say they can now store 201 gigabits per square inch on a special “sputtered” tape made by Sony Storage Media Solutions.

The palm-size cartridge, into which IBM scientists squeezed a kilometer-long ribbon of tape, could hold 330 terabytes of data, or roughly 330 million books’ worth. By comparison, the largest solid-state drive, made by Seagate, is twice as big and can store 60 TB, while the largest hard disk can store only 12 TB. IBM’s best commercial tape cartridge, which began shipping this year, holds 15 TB.

IBM’s first tape drive, introduced in 1952, had an areal density of 1,400 bits per square inch and a capacity of approximately 2.3 megabytes.

IBM sees a growing business opportunity in tape storage, particularly for storing data in the cloud, which is called cold storage. Hard disks are reaching the end of their capacity scaling. And though flash might be much zippier, tape is by far the cheapest and most energy-efficient medium for storing large amounts of data you don’t need to access much. Think backups, archives, and recovery, says IBM Research scientist Mark Lantz. “I’m not aware of anything commercial or on the time horizon of the next few years that’s at all competitive with tape,” he says. “Tape has huge potential to keep scaling areal density.”

To store data on tape, an electromagnet called a write transducer magnetizes tiny regions (small crystals called grains) of the tape so that the magnetization field of each region points left or right, to encode bits 1 or 0. Heretofore, IBM has increased tape drive density by shrinking those magnetic grains, as well as the read/write transducers and the distance between the transducers and the tape. “The marginal costs of manufacturing remain about the same, so we reduce cost per gigabyte,” Lantz says.

The staggering new leap in density, however, required the IBM-Sony team to bring together several novel technologies. Here are three key advances that led to the prototype tape system reported in the IEEE Transactions on Magnetics in July.

New Tape Schematics

The surface of conventional tape is painted with a magnetic material. Sony instead used a “sputtering” method to coat the tape with a multilayer magnetic metal film. The sputtered film is thinner and has narrower grains, with magnetization that points up or down relative to the surface. This allows more bits in the same tape area.

Think of each bit as a rectangular magnetic region. On IBM’s latest commercially available tape, each bit measures 1,347 by 50 nanometers. (Hard disk bits are 47 by 13 nm.) In the new demo system, the researchers shrunk the data bits to 103 by 31 nm. The drastically narrower bits allow more than 20 times as many data tracks to fit in the same width of tape.
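The quoted bit dimensions can be checked against the headline density figure with a little arithmetic. This is a back-of-the-envelope sanity check, not IBM's calculation:

```python
# Sanity check: do 103 nm x 31 nm bit cells give ~201 Gb per square inch?
NM_PER_INCH = 2.54e7  # 1 inch = 2.54 cm = 2.54e7 nm

def areal_density_gbit_per_sq_inch(bit_length_nm, bit_width_nm):
    """Gigabits per square inch for one rectangular bit cell."""
    bits_per_sq_inch = NM_PER_INCH ** 2 / (bit_length_nm * bit_width_nm)
    return bits_per_sq_inch / 1e9

demo = areal_density_gbit_per_sq_inch(103, 31)
print(round(demo))  # about 202, consistent with the reported 201 Gb/in^2
```

The same formula applied to the current commercial bit size (1,347 by 50 nm) gives roughly 9.6 Gb/in², which shows just how large the 20-fold jump in density is.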

To accommodate such tiny data elements, the IBM team decreased the width of the tape reader to 48 nm and added a thin layer of a highly magnetized material inside the writer, yielding a stronger, sharper magnetic field. Sony also added an ultrathin lubricant layer on the tape surface because the thinner tape comes in closer contact with the read/write heads, causing more friction.

More Precise Servo Control

Very much like magnetic disks, every tape has long, continuous servo tracks running down its length. These special magnetization patterns, which look like tire tracks, are recorded on the tape during the manufacturing process. Servo tracks help read/write heads maintain precise positioning relative to the tape.

The IBM team made the servo pattern shorter, narrower, and more angled in order to match the smaller magnetic grains of the tape media. They also equipped the system with two new signal-processing algorithms. One compares the signal from the servo pattern with a reference pattern to more accurately measure position. The other measures the difference between the desired track position and the actual position of the read/write head, and then controls an actuator to fix that error.
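The second algorithm is, at its core, a feedback loop: measure the error between where the head should be and where it is, then drive the actuator to cancel it. The sketch below is an illustrative proportional controller, with invented names and gain, not IBM's actual servo design:

```python
# Hypothetical sketch of track-following feedback: compute the position-error
# signal (desired minus actual head position) and move the actuator a fraction
# of that error each cycle. The gain of 0.5 and step count are illustrative.
def track_follow(desired_nm, actual_nm, gain=0.5, steps=20):
    """Simple proportional feedback loop; returns the residual error in nm."""
    position = actual_nm
    for _ in range(steps):
        error = desired_nm - position  # position-error signal
        position += gain * error       # actuator correction
    return abs(desired_nm - position)

residual = track_follow(desired_nm=0.0, actual_nm=100.0)
print(residual < 6.5)  # a 100 nm initial offset converges to well under 6.5 nm
```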

Together, these advances allow the read/write head to follow a data track to within 6.5-nm accuracy. This happens even as the tape flies by at speeds as high as 4 meters per second (a feat akin to flying an airplane precisely along a yellow line in the road).

Advanced Noise Detection and Error Correction

As the bits get smaller, reading errors go up. “If we squeeze bits closer, the magnetic fields of neighboring bits start to interfere with the ones we’re trying to read,” Lantz says. So the difference between a lower-value 0 signal and a higher-value 1 might be harder to make out. To make up for this, magnetic storage technologies use algorithms that, instead of reading a single bit, take into account signals from a series of bits and decide on the most likely pattern of data that would create the signal. The IBM researchers came up with a new and improved maximum-likelihood sequence-detection algorithm.
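The idea of deciding on a whole sequence rather than one bit at a time can be shown with a toy model. The interference model below (each sample leaks a fraction of its neighbors' signal) and the brute-force search are invented for illustration; real drives use far more efficient detectors over realistic channel models:

```python
# Toy maximum-likelihood sequence detection: instead of thresholding each
# sample alone, pick the bit sequence whose expected (interference-blurred)
# signal best matches the samples actually read.
from itertools import product

def expected_signal(bits, leak=0.3):
    """Signal model: each bit's sample picks up 30% of each neighbor."""
    out = []
    for i, b in enumerate(bits):
        left = bits[i - 1] if i > 0 else 0
        right = bits[i + 1] if i < len(bits) - 1 else 0
        out.append(b + leak * (left + right))
    return out

def ml_detect(samples, n):
    """Brute-force ML detection under a squared-error metric."""
    def dist(bits):
        return sum((s - e) ** 2
                   for s, e in zip(samples, expected_signal(bits)))
    return min(product([0, 1], repeat=n), key=dist)

sent = (1, 0, 1, 1, 0)
noisy = [s + 0.05 for s in expected_signal(sent)]  # mild read noise
print(ml_detect(noisy, 5))  # recovers (1, 0, 1, 1, 0)
```

Note that some of the noisy samples for "0" bits here exceed a naive halfway threshold's comfort zone once neighbors leak into them; the sequence-level comparison still recovers the data.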

They also improved upon the storage technology’s error-correction coding, which is used to slash bit-reading error rates. In the new system, the raw data goes through two decoders. The first looks for errors along rows, while the stronger second one checks columns. The data is run through these decoders twice.
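The row-then-column structure can be illustrated with the simplest possible product code, where each row and column carries even parity: a single flipped bit fails exactly one row check and one column check, which pinpoints it. Real tape drives use much stronger codes; this sketch shows only the two-dimensional structure:

```python
# Minimal row/column ("product code") sketch: locate a single bit error
# by intersecting the failing row-parity and column-parity checks.
def find_single_error(grid):
    """Return (row, col) of a lone bit error, assuming even parity per line."""
    bad_rows = [r for r, row in enumerate(grid) if sum(row) % 2 != 0]
    bad_cols = [c for c in range(len(grid[0]))
                if sum(row[c] for row in grid) % 2 != 0]
    if bad_rows and bad_cols:
        return bad_rows[0], bad_cols[0]
    return None  # no (detectable) error

data = [[1, 0, 1, 0],
        [0, 1, 1, 0],
        [1, 1, 0, 0],
        [0, 0, 0, 0]]  # every row and column has even parity
data[1][2] ^= 1        # flip one bit "in transit"
print(find_single_error(data))  # (1, 2): the flipped bit is located
```

Running the decoders repeatedly, as the IBM system does, lets errors corrected in one pass unlock corrections in the other dimension on the next pass.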


Today’s Internet Optical Illusion Is This Wobbly Floor

Good morning, this floor is flat. I know it looks like something off the cover of a Dr. Seuss book, but I am the Thomas Friedman of floors, and I am here to tell you it is flat, and was probably designed by a bunch of sadistic people who would like nothing more than to watch you trip and fall on your face. Actually, it’s the entryway to a tile company’s showroom in Manchester, England. Which, frankly, is a pretty good way to sell tile. But also, the “falling on your face” thing. (On the upside, the illusion only works from one direction. The floor looks normal walking out of the building.)

What it looks like going out of the building.

Extra: This is a picture of a flat carpet.

Popularity of rooftop solar arrays posing challenges for firefighters

The increasing use of solar power in the region is forcing fire officials to rethink how their departments fight fires involving the technology, especially when the system is on the roof of a burning building.

Coming into contact with live solar arrays and their wires can cause electrical shock or electrocution, according to area fire officials.

And that, along with the added weight solar panels put on a roof, has them concerned about the safety of their crews working a structure fire where the building has a roof-mounted solar array, they say.

Like live wires at a fire scene, photovoltaic systems have to be de-energized for firefighters to safely come into contact with them, fire officials say. If that isn’t possible, crews then have to stay away from panels and the area of roof where the panels are mounted, they said.

Even if the outside electrical connection to a building has been disconnected, a current would still flow between the building and the solar installation, until that system is somehow disconnected, Jeffrey Chickering, Keene’s deputy fire chief, said.

That can have an effect on the methods used to fight a fire, including where hoses are aimed and at what distance, and where holes are cut in a building’s roof to vent flames and smoke, he said.

Other concerns about fighting fires involving photovoltaic installations include the weight solar panels add to a roof, and the placement of those panels, he said.

Any weight added to a roof can increase the likelihood of some level of building collapse during a fire, he said, and firefighters aren’t going to cut through solar panels to vent a roof.

In addition, if the solar array goes right to a roof’s edge, firefighters aren’t going to hook a metal ladder onto it to gain access to the roof, he said.

“It definitely adds a lot more challenges for us,” he said.

One way Keene fire officials have sought to address that challenge is to work with the city’s building department staff to have a record of which buildings in the city have photovoltaic systems, where those systems are installed, and how they can be shut down, he said.

Another solution is training, he said, which the department did in-house in July 2016.

The training, put out by the Clean Energy States Alliance, included sections on recognizing photovoltaic systems and components, understanding systems’ labeling, ways to disconnect systems and tactical approaches.

According to the instruction material, solar thermal systems, which are used to heat water, don’t present the same risk of electrical shock as photovoltaic systems, but they could be a scalding hazard.

Swanzey Fire Chief Norman W. Skantze said the concept of photovoltaic installations affecting firefighting is fairly new and is not part of regular firefighter training, but is being handled as a seminar-style class.

At some point, the training should be offered as part of regular firefighter courses with the use of residential and commercial photovoltaic systems becoming more widespread, he said.

“Sooner or later firefighters will encounter the panels in the normal course of their work,” he said. “We don’t want to see any member of the public, or firefighters, injured as a result of new technology.”

His department is evaluating how it can roll into the monthly training curriculum instruction on the potential hazards and strategies for approaching fires involving buildings with photovoltaic arrays, Skantze said.

In the meantime, Swanzey firefighters are being encouraged to participate in a self-paced online course put together by the International Association of Firefighters, he said.

According to the Clean Energy States Alliance training, there are three types of inverters that convert direct-current energy produced by a photovoltaic system into alternating current to match a building’s electrical system and the power grid.

Those inverters stop converting electricity when the utility power shuts down, the training said.

However, that doesn’t stop the photovoltaic system from producing electricity, and that system has to be de-energized manually through disconnection switches, according to the training.

Those disconnects can be inside or outside a house.

Newer systems have what’s called rapid shutdown, which stops the system from producing electricity quickly from one disconnection point, according to the training. Those systems would have been installed after the 2014 edition of the National Electrical Code went into effect, mandating rapid shutdown for photovoltaic systems with rooftop solar arrays.

While there are codes to follow when installing photovoltaic systems, how those codes are applied and enforced ultimately depends on the city or town in New Hampshire, Pablo Fleischmann, co-owner of Green Energy Options in Keene, said.

For example, some communities require that solar panels be installed with a 3-foot setback from the sides of the roof and the ridge, and others don’t, he said.

“What we’re running into here in New Hampshire, it depends on the town,” he said. “Some towns are taking it to the extreme no matter what. Some others don’t have any requirements.”

Solar arrays are made up of modules that are typically anywhere from 25 to 40 volts apiece, he said. It’s when those modules are connected into an array that the voltage increases, he said.

For example, his company installs residential solar arrays up to 600 volts, he said.
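Because module voltages add when modules are wired in series, the figures above imply how long a string can get before hitting a system limit. The numbers below are worked examples using the voltages quoted in the article, not a specific installation:

```python
# Series-connected PV modules: string voltage is the sum of module voltages.
def string_voltage(module_volts, modules_in_series):
    return module_volts * modules_in_series

def max_modules(system_limit_volts, module_volts):
    """How many modules fit in series without exceeding the system limit."""
    return system_limit_volts // module_volts

print(string_voltage(40, 15))  # 600 V: fifteen 40 V modules in series
print(max_modules(600, 25))    # 24: lower-voltage modules allow longer strings
```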

Craig J. Bell, general manager of Solar Source in Keene, said that in the electric code, rapid shutdown means interrupting the direct current of a roof-mounted photovoltaic system within 10 feet of the array on the outside of a building, and within 5 feet once the current enters a building.

Basically, the process lowers the voltage level of an array to 30 volts within 10 seconds, he said.

That voltage is considered safe to touch, and safe for firefighters to be up on a roof working around a solar array as long as they’re not cutting wires, he said.

The 2017 version of the code takes that a step further and requires there be an external means to activate a rapid shutdown, he said.

It also requires a distinction between labeling of systems with rapid shutdown depending on whether they were installed under the 2014 or 2017 code, he said.

Another requirement in the 2017 code changes the boundaries of the effects of the rapid shutdown to within 1 foot of a solar array on the outside of a building, and within 3 feet from where the current from the array enters a building, he said.

Further, it requires a rapid shutdown system to lower the voltage outside the 1- or 3-foot boundaries to 30 volts within 30 seconds, he said. Inside the boundaries, the voltage must be lowered to 80 volts within 30 seconds, but that change won’t take effect until Jan. 1, 2019, he said.
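The 2014 and 2017 figures quoted above can be collected into a small lookup table. This is a paraphrase of the article's description, assembled for clarity; the National Electrical Code itself is the authority on these values:

```python
# Rapid-shutdown figures as described in the article, by NEC edition.
# (Paraphrased summary, not the code text itself.)
RAPID_SHUTDOWN = {
    2014: {"outside_boundary_ft": 10, "entry_boundary_ft": 5,
           "volts_limit": 30, "seconds": 10},
    2017: {"outside_boundary_ft": 1, "entry_boundary_ft": 3,
           "volts_limit": 30, "seconds": 30,
           "inside_boundary_volts_from_2019": 80},
}

def shutdown_limits(code_year):
    return RAPID_SHUTDOWN[code_year]

print(shutdown_limits(2017)["outside_boundary_ft"])  # 1 (was 10 under 2014)
```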

The Keene and Swanzey fire departments haven’t been called to structure fires in their towns involving a residential or commercial roof-mounted solar array, but their counterpart in Peterborough has.

Peterborough Fire Chief Ed Walker said the call came about three years ago at a small hydroelectric facility. Some type of fault in the solar panels on the building’s roof had caused them to ignite, he said. Firefighters stayed away from the roof and the panels and were able to put out the fire using water sprayed from an aerial ladder, he said.

Solar installations on the roofs of residential buildings aren’t a big problem, as they’re typically only on one side, he said. The roof-mounted commercial systems can be quite problematic because the roofs of those buildings are usually flat with solar panels covering them, he said.

Another factor to consider is that as solar technology advances, the traditional solar panel design is changing, Walker said. For example, solar panels designed to mimic shingles are now available, he said.

As photovoltaic systems become more popular, it’s important for firefighters to be aware of the systems, the risks they pose when fighting fires, and how to minimize those risks, he said. Peterborough firefighters have so far done some online training and had discussions about the topic, he said.

“If you don’t see something a lot, you don’t worry about it,” he said. “This is something we’re seeing a lot of now, so we have to worry about it.”

​Microsoft just ended support for Office 2007 and Outlook 2007 | ZDNet

Microsoft is urging customers still on Outlook 2007 and Office 2007 to upgrade as each of the products ran out of extended support on Tuesday.

That means no more security updates, feature updates, support or technical notes for the products, which Microsoft has supported for the past decade.

Microsoft wants customers on Office 2007 to plan to migrate to Office 365 in the cloud or to upgrade to Office 2016.

Office 2007 introduced Microsoft’s “ribbon” interface that brought a series of tabbed toolbars with each ribbon containing related buttons.

For customers who already use Office 365 but are still on Outlook 2007, it will be important to upgrade by the end of October, after which the product won’t allow users to access Exchange Online mailboxes through the Office 365 portal.

“Customers who use Office 365 will have noted that there is a change to the supported client connectivity methods. Outlook Anywhere is being replaced with MAPI/HTTP. Outlook 2007 does not support MAPI/HTTP, and as such will be unable to connect,” Microsoft highlights in a send-off note for the email client.

Come October 31, Microsoft will drop support for the RPC over HTTP protocol, also known as Outlook Anywhere, for accessing mail data from Exchange Online. The new protocol, MAPI over HTTP, is sturdier and supports multi-factor authentication for Office 365, according to Microsoft. Microsoft didn’t backport the protocol to Outlook 2007 as it would be past its extended support date by the time it cut off Outlook Anywhere.

Microsoft has a full list of Office 2007 products and their exact cut off dates here and Outlook 2007 here.

Unlike in previous years, Microsoft is not offering enterprise customers extended support for Office 2007 through its custom support contracts. The same goes for its other Office products, including Exchange Server; Office Suites; SharePoint Server; Office Communications Server; Lync Server; Skype for Business Server; Project Server and Visio.

Microsoft said demand for custom support has declined with greater adoption of Office 365.

Even teachers now say that academics are not the key to kids’ success

To many, increasing automation and the unprecedented pace of technological changes mean kids need more than just academic skills to succeed. They need confidence and motivation to tackle problems, interpersonal skills to work with others and the resilience to stay on task when things fall apart.

New research from the Sutton Trust, a British foundation focused on social mobility, finds that 88% of young people, 94% of employers, and 97% of teachers say these so-called life skills are as or more important than academic qualifications. Perhaps more surprising: more than half of teachers surveyed—53%—believe these “non-cognitive” or “soft” skills are more important than academic skills to young people’s success.

“It is the ability to show flexibility, creativity, and teamwork that are increasingly becoming just as valuable, if not more valuable, than academic knowledge and technical skills,” said Peter Lampl, founder and chairman of the Sutton Trust.

The teachers’ perspective flies in the face of a decades-long movement from governments in the US and UK toward increased standards and testing. The more educators emphasize test scores, the more teachers feel hamstrung to focus their teaching on preparing for those tests, which crowds out the space to teach a subject they might love, or to underpin the subject with creative and collaborative projects and lessons to help build social and emotional learning, or character.

While testing has an important role to play in education, more research is pointing to the idea that too much testing crowds out real learning. Amanda Spielman, Britain’s chief school inspector, said this week that “the regular taking of test papers does little to increase a child’s ability to comprehend. A much better use of time is to teach and help children to read and read more.”

Meanwhile, teaching “character” is taking hold everywhere from Singapore and China to Colombia and Uganda. And employers are on board. Recent research in the US shows that jobs requiring a combination of strong social and cognitive skills are rising far faster than those based on cognitive ability alone.

Unfortunately, the Sutton Trust research found that, despite a lot of lip service about the importance of life skills, most schools in the UK aren’t doing enough to teach them. The National Foundation for Education Research asked secondary school teachers (kids aged 13-18) across England how many offered programs to build life skills, such as extracurricular activities (sports, drama, debating), or volunteering programs. They also asked kids whether they participated. More than a third of students—37%—don’t take part in any clubs or activities. Nearly half of teachers said their schools provided debating, yet just 2% of young people said they participated.

There is also a huge socioeconomic dimension. Less than half (46%) of students from disadvantaged backgrounds participate in extracurriculars, compared to 66% from better off families.

The group stressed the need for a more holistic approach to children’s education, and also vouchers to help disadvantaged kids participate in extracurricular activities. It noted that private schools have long focused on the importance of building confidence, articulacy (a Britishism for being articulate), and perseverance. And research shows those private school kids dominate the ranks of government and industry.

Long Sleeves on Doctors’ White Coats May Spread Germs

SAN DIEGO — Doctors may want to roll up their sleeves before work, literally. A new study suggests that long sleeves on a doctor’s white coat may become contaminated with viruses or other pathogens that could then be transmitted to patients.

In the study, the researchers had 34 health care workers wear either long- or short-sleeved white coats while they examined a mannequin that had been contaminated with DNA from the “cauliflower mosaic virus.” This virus infects plants and is harmless to humans, but it is transmitted in a way that is similar to that of other, harmful pathogens, such as Clostridium difficile, a bacterium that causes severe diarrhea, said Dr. Amrita John, an infectious disease specialist at University Hospitals Case Medical Center in Cleveland, who led the study. John presented the research here on Friday (Oct. 6) at an infectious disease conference called IDWeek 2017.

The health care workers wore gloves while they examined the mannequin, then removed the gloves, washed their hands and put on a new pair of gloves before examining a second, clean (non-contaminated) mannequin. After the health care workers had finished examining both mannequins, the researchers swabbed the workers’ sleeves, wrists and hands, and tested the samples for DNA from the cauliflower mosaic virus. Each of the 34 participants completed the exam twice (once wearing short sleeves and once wearing long sleeves), for a total of 68 “simulations.”

They found that, when the health care workers wore long-sleeved coats, 25 percent of the simulations resulted in contamination of their sleeves or wrists with the virus DNA marker, compared with none when the health care workers wore short-sleeved coats.

In addition, about 5 percent of health care workers who wore long sleeves contaminated the clean mannequin with the virus DNA marker, while none of the health care workers who wore short sleeves contaminated the clean mannequin.

These results provide support for a recommendation “that health care personnel wear short sleeves to reduce the risk for pathogen transmission,” John said. [10 Deadly Diseases That Hopped Across Species]

Such a recommendation already exists in the United Kingdom — in 2007, the country’s department of health introduced a “bare below the elbow” policy for hospitals, which recommended that health care personnel wear short sleeves. In the United States in 2014, the Society for Healthcare Epidemiology of America said that health care facilities might consider the adoption of a “bare below the elbow” policy.

Some U.S. facilities have subsequently adopted this policy within their institutions, and the new findings suggest that “more people should consider it,” said study co-author Dr. Curtis J. Donskey, an infectious disease specialist and professor of medicine at Case Western Reserve University in Cleveland.

Still, the policy has been met with some resistance, with some doctors calling for more evidence showing that long sleeves really do increase the likelihood of transmitting pathogens. The new study provides some evidence, but additional, larger studies are still needed before some hospitals may adopt the policy, John said.

In addition, future research is still needed to show that a short-sleeve policy actually reduces the number of infections spread in a hospital, the researchers said.

But John said the study has changed her personal preference for the way she wears her white coat. “I roll up my coat sleeves above my elbow,” John said.

Astronomers Are ‘Racing Against Time’ as Humanity Clogs the Air With Radio Signals

In a remote valley in the British Columbia interior, a massive telescope called the Canadian Hydrogen Intensity Mapping Experiment (CHIME) is scouring the skies for traces of dark energy, a mysterious force that drives the expansion of the universe but has never been directly detected. As if hunting for dark energy isn’t challenging enough, radio astronomers fear it might not be long before proliferating tech like smartphones and space satellites make these kinds of studies—and even the ongoing search for aliens—impossible, due to radio interference.

According to Mark Halpern, principal investigator at CHIME and astronomy professor at the University of British Columbia, the growing number of communications satellites in space as well as technologies on the ground that emit radio waves are interfering with CHIME’s data-collecting, and could potentially do more damage in the future. If radio astronomers aren’t able to do their research, it could prevent us from making future discoveries about our universe.

“I feel like we’re racing against time to get CHIME done while we still can,” Halpern said.


Radio frequencies are everywhere. They’re used to transport information for radio broadcasts, television, and your cell phone. As anybody who’s ever awkwardly spoken to someone on a walkie-talkie that accidentally hooked up to the wrong frequency knows, it’s really easy to interrupt those channels when one signal bleeds into the other.

It might not be such a big deal when your television gets a little bit of static. But interference can cause radio astronomers to lose their research data. Radio astronomy has led to the discovery of quasars, the imaging of asteroids, and showed us the cosmic microwave background, which is leftover radiation from the Big Bang. Just this week, scientists discovered a new source of gravitational waves: the violent merger of two neutron stars 130 million light years away. Astronomers will study the resulting radio waves to learn more about the energy of a neutron star collision, and how much mass is ejected.

The International Telecommunications Union (ITU), the United Nations’ agency for policing frequencies, provides recommendations for how radio frequencies should be distributed. The agency sets aside a band of radio waves specifically for radio astronomy projects.

But the nature of CHIME’s experiment makes it so that the telescope has to access a broader range of frequencies outside of that spectrum to map out more parts of the universe at once.

Halpern said the huge amount of frequencies they were accessing wasn’t a problem when they initially started the project. They’re located in a radio safe zone in a valley near Penticton, BC, where there is government-approved signage in the area telling drivers to turn off all electronic devices. But he said that around three years ago, a series of television stations started opening up near Penticton, causing bleeds into their signal.

Although the valley protects CHIME against local radio waves, the television satellites locked in orbit still cause interference. And since the scientists don’t own the frequencies they use—and won’t likely ever afford to buy a spectrum needed to perform their experiments, as they sell for billions—these scientists can’t do much about it. Halpern expects the rest of their radio waves will be auctioned off for television eventually.

“It’s completely not in the cards that CHIME could use any part of its budget to buy its own frequency,” Halpern said. In terms of funding and priority, he said, communications services dwarf the resources of radio astronomers.


Satellites aren’t the only thing bleeding into other frequencies. According to Ken Tapping, an astronomer at the National Research Council’s Dominion Radio Astrophysical Observatory in Penticton, where CHIME is also based, everyday tech like smartphones and those new wireless car keys emit accidental radio waves, called “unwanted emissions,” that interfere with other frequencies.

“These things splatter across all paths of the radio spectrum. They’re produced inconsequentially,” Tapping told me in a phone interview.

The ITU recommends that radio astronomy studies expect a maximum of five percent of their total data lost due to interference. It might not seem like a lot, but for experiments that depend on tracking short radio wave bursts, it could mean losing information that’s crucial to the experiment. Tapping said that in the future, even if everybody sticks to their allotted radio emissions, the amount of interference will increase due to the sheer amount of technology.

“They’re dirt cheap, they’re imported from abroad, and they’re being deployed all over the place in a currently uncontrolled fashion,” Tapping said, referring to the proliferation of cheap electronic devices. “If these reach a certain density of use, then radio astronomy could become rather difficult.”

Tapping hopes the ITU will be able to cap losses at five percent. If a radio astronomy study is losing more of its research than that, he said, it might cause problems with funding, since funding sources could feel as though they’re not getting an adequate return on their investment.

Growing radio interference will also make it harder for scientists to receive signals from extraterrestrial life. At the SETI (Search for Extraterrestrial Intelligence) Institute in Mountain View, California, radio astronomers are constantly looking for any sort of message that didn’t come from humans. But, according to SETI senior astronomer Seth Shostak, the signals they look for are the same ones humans produce every day.

“The question is ‘Is this ET on the line, or is it another telecommunications satellite passing overhead?'” Shostak told me. When SETI detects a promising radio signal, its astronomers check if it moves in correlation with the rotation of the Earth, and make sure that their other receivers don’t detect it, since that would indicate it’s either a human-made satellite in orbit or a ground-based radio system.

The clear solution for radio astronomers is to move their instruments to remote areas with little human presence. This can occasionally result in rules that are somewhat dystopian: China banned any electronic devices and created a resident-free zone around its massive new radio telescope to ensure there would be no interference, relocating 9,000 residents in the process.

Halpern said his team did field measurements in remote places like the Sahara Desert when trying to figure out where to put CHIME. But remote locations come with their own challenges, like ensuring safety and access for scientists, and the extra cost of building in unpopulated areas.

Another option, Shostak suggests, is to move radio astronomy projects to the far side of the Moon, which is shielded from any frequencies from Earth. The obvious problem, he said, is that these projects don’t have the astronomical amount of funding necessary for a lunar mission, so they’re Earthbound for now.

To prevent the death of radio astronomy, Tapping said that astronomers have to work more closely with communications companies for solutions. The introduction of more low-power transmitters for smartphones and other technologies could reduce the amount of unwanted emissions, and increase the battery life of those products, too. But Tapping pointed out that would affect these companies’ bottom line, so the risk of increased interference lives on.

“There’s a Darwinian struggle going on,” Tapping said. “But I’ll be honest, there always has been.”

USA and Japan’s giant robot battle was a slow, brilliant mess

The oft-delayed giant robot fight has finally taken place. On Tuesday, Team USA’s mechs duked it out with Japan’s Kuratas in an abandoned steel mill for the world to watch. There could only be one victor, and it proved to be the red, white, and blue. Yes, the MegaBots team representing America came out on top, but not before three gruelling rounds of robopocalypse.

Those who tuned into Twitch to view the action saw Team USA’s Iron Glory get knocked down by Japan’s Kuratas bot straight out of the gate, its paintball cannon clearly no match for its 13-foot rival’s half-ton fist. In the second round, the MegaBots pilots came back with the newer Eagle Prime machine, itself decked out with a mechanical claw and Gatling gun. But they still struggled to land a deadly blow, instead getting stuck to their foe, with Kuratas’ drone sidekick making life that much harder. Then, in the final round, things got grisly: Eagle Prime whipped out a chainsaw to dismember Suidobashi Heavy Industry’s juggernaut and end the carnage.

Okay, so Team USA had the unfair advantage of using two bots, and the entire event may have been as choreographed as a WWE match, but it was strangely watchable regardless.

With a win under its belt, the MegaBots team now wants to start a full-blown giant robots sports league. And, there’s at least one contender waiting in the wings.

Pastry Chefs Forced to Get Creative as Vanilla Prices Soar

As Hurricane Harvey barreled toward Texas, Rebecca Masson, owner of Houston’s Fluff Bake Bar, thought about what was most important to her: what she had to keep safe. She ran to her pantry, grabbed the last 10 quarts of vanilla she had, and sped to shelter. At a time when top vanilla producers are charging $600 to $750 per kilogram for vanilla beans, Masson’s stash was nothing short of liquid gold. “I could not risk it being flooded or stolen,” she says. “To lose all my vanilla? That would be no joke.”

Bakers and ice cream makers across the country have been crushed by the price surge for vanilla, which spiked after a cyclone hit Madagascar, the world’s leading producer of vanilla, on March 7. Vanilla beans now sell for around $600 per kilogram, up from around $100 in 2015, while pure vanilla extract is near $500 per gallon, up from $70 a gallon in 2015.

While price hikes due to weather or a poor harvest are nothing new, the current vanilla crisis is unique. “The increase feels different than any other price hike we have seen because it is both prolonged and dramatic,” says Allison Kave, who co-owns Brooklyn bar and bakery Butter & Scotch.

“I’ve seen hikes before,” Masson says, recalling a 2005 surge when vanilla bean prices doubled. “But six to seven months later, prices went back down.” Not so this time. Instead, prices have shown no signs of softening.

Craig Nielsen, VP of sustainability at Nielsen-Massey, which has been in the business of making vanilla since 1907, says his company does not expect a change in price anytime soon. Vanilla plants take about three years to mature and produce beans. When the cyclone hit this spring, it tore through the main vanilla-growing areas in Madagascar, known as the SAVA region. Not only were crops devastated, but the surrounding trees, essential to filter sunlight and diffuse the heat hitting the vanilla vines, were also decimated. This means future crops may also be damaged or die from the stress of too much sun.

But Nielsen says the price hike is about more than the cyclone. In 2007, vanilla production began to decline in alternate growing regions (regions outside of Madagascar) because prices had fallen so low. It makes sense: farmers were not willing to invest the time and labor to grow and harvest vanilla in that depressed market, and supply started to decline.

Then, in 2015, vanilla prices started to climb as consumers began demanding natural ingredients in their candy bars, ice cream, and cakes. In November 2015, Hershey’s announced that it would swap out the artificial ingredient “vanillin” for the real deal in its kisses and chocolate bars. The move was the first in a series of changes to remove all artificial ingredients from the chocolates. With big food demanding real vanilla, prices started to climb to $150, then $200, then $275 a gallon, according to Masson. Add on a cyclone, and the three- to four-year life cycle of the crop, and prices went through the roof.

Some makers, like Amy Keller of Jane’s Ice Cream in Kingston, New York, were smart enough to stockpile vanilla at the first sign of a price surge a few months ago. But Keller is already worrying about what will happen when she runs out, as prices have gone up not only for Madagascar vanilla (which accounts for 75 to 80 percent of world supply) but for vanilla from other sources — Indonesia, Mexico, Uganda, India — because of the heightened demand.

“If I could increase the price of my ice cream at the same percentage as the rising price of vanilla, I’d be doing really well right now — like, really well,” says Peter Arendsen, owner of the wholesale ice cream company Ice Cream Alchemy. Unfortunately, he can’t, so for now, he eats the cost. As does Ample Hills Creamery in Brooklyn, where co-owner Jackie Cuscuna says she will not pass the cost onto her customers, but notes that she has stopped introducing any new flavors made with vanilla.

Others have had to take more severe action. This summer, the organic ice cream company Blue Marble stopped selling its vanilla base to its wholesale customers, instead offering sweet cream or buttermilk flavors. Elsewhere, New York City pastry chef Fany Gerson, who relies on vanilla for her La Newyorkina popsicles and her doughnuts at Dough, has taken to milking the most out of every pod: She uses the beans once, then soaks them and uses the resulting liquid, then dries them and grinds them into vanilla sugar.

Four months ago, when Eric Berley, who co-owns the Philadelphia ice cream shop Franklin Fountain with his brother Ryan, started paying $544 a gallon for vanilla, he crunched the numbers and estimated that he would have to spend $22,000 more on vanilla this summer than last. To mitigate losses, he painstakingly reviewed every ice cream recipe and held blind taste tests with lower amounts of vanilla. The tweaked recipes have helped somewhat. “We didn’t have to take the full hit,” he says.

The soaring cost of vanilla did force prices up at Butter & Scotch: on August 1, the price of its whole vanilla birthday cake rose from $60 to $72. “It was a really hard decision, but we’d seen the price increase so dramatically,” says Kave. Kave and co-owner Keavy Blueher have also stopped offering homemade cream soda (made from whole vanilla beans), and, like Berley, have tweaked recipes to use the least amount of vanilla possible. They’ve even looked into making their own vanilla extract, but after doing the math on bean prices and labor, found it would not make sense.

Imitation product is available, sure, but most bakers worth their weight in frosting won’t touch the stuff. “I don’t use anything artificial or made in a lab,” says Masson, who managed to find a blend of Tahitian and Madagascar vanilla extract at $1.72 an ounce ($500/case) in July. But she’s not sure what she will do when her vanilla runs out: She says the case is already up to $991. “Vanilla is all I think about,” she says. “I dream about it. Because at this rate, I just won’t be able to afford it.”

While bakers and makers are reeling, there may be a silver lining in this story after all. Nielsen points out that previously low price levels were not sustainable for the farmers, because of how labor-intensive the crop is to grow, harvest, and produce. “There needed to be an adjustment in price to keep farmers interested in growing and maintaining the vines,” he says.

Nielsen predicts future vanilla prices will undergo a measured, not dramatic, price decline because of the continued strong global demand, tied to a commitment by large food manufacturers to use natural versus artificial flavors. A more moderate price, somewhere around $100 or $150 a gallon, might be the best of both worlds.

Why is our universe three dimensional? Cosmic knots could untangle the mystery

Next time you’re untangling your earbuds in frustration, here’s an idea to help put it in perspective: knots may have played a crucial part in kickstarting our universe, and without them we wouldn’t live in three dimensions. That’s the strange story pitched by a team of physicists in a new paper, and the idea actually helps plug a few plot holes in the origin story of the universe.

Our universe has three spatial dimensions. That’s such a basic fact of reality that most people don’t ever stop to question why it’s the case. But in theory, three dimensions seems like a somewhat arbitrary number. Why doesn’t our universe have four, or five, or 11 dimensions? The question has plagued physicists, but trying to answer it has all too often been relegated to the “too hard” basket.

After five years of tackling the problem, an international team of physicists has developed a theory that not only explains how the universe arose in its three-dimensional state, but also solves several other mysteries of its birth and growth. The key is a fairly common element of the Standard Model of particle physics called a flux tube.

Flux tubes are flexible strands of energy that bind elementary particles together – linking quarks and antiquarks with the help of gluons. But as the particles drift apart, they can eventually break the flux tube between them. That gives off a burst of energy that creates a new quark-antiquark pair, which bind to the existing particles to form two complete pairs.
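The energetics of that snap can be sketched with the textbook picture of quark confinement; the linear potential and the rough string-tension value below are standard QCD estimates, not figures from the study:

```latex
% A flux tube stores energy roughly linearly in the quark separation r:
%   V(r) \approx \sigma r, with string tension \sigma \approx 1~\mathrm{GeV/fm}.
% The tube breaks once the stored energy can pay for a new quark-antiquark pair:
\[
  \sigma\, r \;\gtrsim\; 2 m_q c^2
  \quad\Longrightarrow\quad
  r_{\mathrm{break}} \approx \frac{2 m_q c^2}{\sigma}
\]
```

This is why pulling the pair apart never frees a lone quark: the energy invested in stretching the tube is converted into new particles instead.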

Flux tubes are a well known phenomenon, but for the new theory the physicists found that by kicking those up to a higher energy level, they can solve some mysteries about why the universe happens to be exactly the way we see it.

In the early days of everything, the universe was just a hot, thick primordial soup called quark-gluon plasma. With so many elementary particles in close proximity, they would have created a whole mess of flux tubes. Most of these tubes would have quickly been destroyed though, since matter and antimatter annihilate each other when they meet, taking the flux tubes with them.

But there are times when flux tubes can survive longer than the particles that they link. If those particles move in just the right way, they can twist their flux tubes into knots, which are stable enough to exist on their own. And if several of these flux tubes intertwine, they can form an even more stable network of knots, which would have quickly filled the early universe.

The team soon realized this idea explained two long-standing issues with the currently accepted idea of how the universe came to be. The story goes that in its first few moments, the universe underwent a period of extremely rapid expansion – from the size of a single proton to that of a grapefruit in less than a trillionth of a second. After that, expansion happened much more slowly, although it is currently accelerating.

But two questions about that story have never been properly answered: what triggered that sudden burst of expansion, and then why did it slow back down again? When the team calculated how much energy would be tied up in their knotty network, they realized it gave a convenient explanation for both of those.

“Not only does our flux tube network provide the energy needed to drive inflation, it also explains why it stopped so abruptly,” says Thomas Kephart, co-author of the study. “As the universe began expanding, the flux-tube network began decaying and eventually broke apart, eliminating the energy source that was powering the expansion.”

The story neatly fits in with existing ideas of the origins of everything. After the flux tube network breaks down, it releases particles and radiation into the universe, which then continues to expand and evolve the way other theories have explained.

That also brings us back to the question of why the universe is three dimensional. According to knot theory, knots can only exist in three dimensions: as soon as you add a fourth, they quickly unravel. That means that during the early period, the knotted flux tubes would have only caused rapid expansion in the three spatial dimensions. By the time the flux tube network broke down, the groundwork had already been laid for a 3D universe to evolve, and any higher dimensions that exist would remain tiny and essentially undetectable.

While the theory is certainly intriguing, it’s still a work in progress. Before the idea can be properly proposed, the researchers say they need to develop it further to allow it to make testable predictions about the nature of the universe.

And in the end, maybe tangled earbuds are a small price to pay, considering we might not exist without them.