Memristor-Driven Analog Compute Engine Would Use Chaos to Compute Efficiently

With Mott memristors, a system could solve intractable problems using little power

When you’re really harried, you probably feel like your head is brimful of chaos. You’re pretty close. Neuroscientists say your brain operates in a regime termed the “edge of chaos,” and it’s actually a good thing. It’s a state that allows for fast, efficient analog computation of the kind that can solve problems that become vastly more difficult as they grow in size.

The trouble is, if you’re trying to replicate that kind of chaotic computation with electronics, you need an element that both acts chaotically—how and when you want it to—and can scale up to form a big system.

“No one had been able to show chaotic dynamics in a single scalable electronic device,” says Suhas Kumar, a researcher at Hewlett Packard Labs, in Palo Alto, Calif. Until now, that is.

He, John Paul Strachan, and R. Stanley Williams recently reported in the journal Nature that a particular configuration of a certain type of memristor contains that seed of controlled chaos. What’s more, when they simulated wiring these up into a type of circuit called a Hopfield neural network, the circuit was capable of solving a ridiculously difficult problem—1,000 instances of the traveling salesman problem—at a rate of 10 trillion operations per second per watt.

(It’s not an apples-to-apples comparison, but the world’s most powerful supercomputer as of June 2017 managed 93,015 trillion floating point operations per second but consumed 15 megawatts doing it. So about 6 billion operations per second per watt.)
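Those efficiency figures are easy to sanity-check; the snippet below simply redoes the arithmetic from the numbers quoted above.

```python
# Reported figure for the simulated memristor Hopfield network:
memristor_ops_per_watt = 10e12   # 10 trillion operations per second per watt

# June 2017's top supercomputer, per the comparison above:
super_flops = 93_015e12          # 93,015 trillion floating point ops per second
super_watts = 15e6               # 15 megawatts

super_ops_per_watt = super_flops / super_watts
print(super_ops_per_watt / 1e9)                      # ≈ 6.2 billion ops/s/W
print(memristor_ops_per_watt / super_ops_per_watt)   # ≈ 1,600x more efficient
```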

The device in question is called a Mott memristor. Memristors generally are devices that hold a memory, in the form of resistance, of the current that has flowed through them. The most familiar type is called resistive RAM (or ReRAM or RRAM, depending on who’s asking). Mott memristors have an added ability in that they can also reflect a temperature-driven change in resistance.

The HP Labs team made their memristor from an 8-nanometer-thick layer of niobium dioxide (NbO2) sandwiched between two layers of titanium nitride. The bottom titanium nitride layer was in the form of a 70-nanometer-wide pillar. “We showed that this type of memristor can generate chaotic and nonchaotic signals,” says Williams, who invented the memristor based on theory by Leon Chua.

What’s basically happening is that by controlling voltage and current, the device can be put into a state where tiny, random thermal fluctuations in the few nanometers of NbO2 are amplified enough to alter the way the memristor reacts. Williams and his colleagues note that these fluctuations are only big enough to affect things in memristors of this scale. They never saw it in larger devices.

Once they’d characterized what the memristor was doing and how it was doing it, they simulated it in a circuit to see what it could do. In the simulation, they integrated an array of Mott memristors with another, more common type made of titanium oxide to form a Hopfield network. These networks are particularly good at solving optimization problems. That is, problems where you’re trying to discover the best solution from a number of possibilities.

(The traveling salesman problem is one of these. In it, the salesman must find the shortest route that lets him visit all of his customers’ cities, without going through any of them twice. It’s a difficult problem because it becomes exponentially more difficult to solve with each city you add.)
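To make that scaling concrete, here is a minimal brute-force solver—illustrative code, not the paper’s method. It checks every possible tour, so the work grows factorially with the number of cities.

```python
from itertools import permutations
import math

def shortest_tour(dist):
    """Brute-force TSP: fix city 0 as the start and try every ordering of the rest."""
    n = len(dist)
    best = float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        best = min(best, length)
    return best

# Four cities at the corners of a unit square (1.414 is roughly the diagonal);
# the optimal tour walks the perimeter, length 4.
dist = [[0.0, 1.0, 1.414, 1.0],
        [1.0, 0.0, 1.0, 1.414],
        [1.414, 1.0, 0.0, 1.0],
        [1.0, 1.414, 1.0, 0.0]]
print(shortest_tour(dist))   # 4.0

# The search space is (n-1)! orderings: 6 tours for 4 cities,
# already 3,628,800 for 11 cities, and roughly 10^156 for 100.
print(math.factorial(10))    # 3628800
```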

You can imagine the solutions to these problems as valleys in a landscape. The best solution is the lowest point in the landscape, and a computer’s efforts to find it are like a ball rolling down the hills. The problem is that the ball can get stuck in a valley that is low (a solution) but not the lowest one (the optimal solution). The advantage of the Mott memristor network is that the chaotic behavior is enough to basically bump the ball out of the less-than-optimal solutions so it can find the best solution.

“In our case, we’re using chaotic noise to hop out of these barriers,” says HP Labs research scientist Strachan.
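That escape mechanism can be sketched with a toy numerical experiment—an illustration of the general idea, not the actual memristor circuit. Plain gradient descent on a two-valley landscape stays trapped in the shallow valley, while adding random kicks, standing in for the chaotic noise, lets the ball hop the barrier and settle in the deeper one. All constants below are arbitrary choices for the illustration.

```python
import random

# Toy landscape: a shallow valley near x = -0.93 (a mediocre solution) and a
# deeper one near x = +1.06 (the better solution).
def grad(x):
    return 4 * x**3 - 4 * x - 0.5   # derivative of E(x) = x^4 - 2x^2 - 0.5x

def descend(x, noise_std):
    """Gradient descent with random kicks, then a quiet phase to settle."""
    for _ in range(3000):
        x += -0.05 * grad(x) + random.gauss(0, noise_std)
    for _ in range(500):             # no noise: roll to the bottom of whichever valley
        x -= 0.05 * grad(x)
    return x

random.seed(1)
stuck = descend(-1.0, noise_std=0.0)
print(round(stuck, 2))               # -0.93: without noise, trapped in the shallow valley

hops = sum(descend(-1.0, noise_std=0.15) > 0 for _ in range(10))
print(hops)                          # most of the 10 noisy runs reach the deeper valley
```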

Williams envisions these “analog compute engines” one day embedded in systems-on-a-chip to accelerate optimization problems. But there are plenty of steps before that. Among the first is to build the system and investigate how well it scales. They’ll also need to properly benchmark its performance against the best algorithms and hardware.

For Williams, there’s a bigger lesson in the development of these memristors. “Everyone’s trying to reinvent the transistor using a new material,” he notes. “Even if you made a perfect transistor—whatever that is—you’d still not beat scaled CMOS.” Instead scientists and engineers should be looking for new types of computing from these new materials. “It’s important to ask what the material system is doing that’s different than what a transistor does… Rather than make a bad transistor, see if it makes something that would take 100 or 1,000 transistors to replicate.” Williams and his team are hoping their memristor system does just that.


Why Solar Microgrids May Fall Short in Replacing the Caribbean’s Devastated Power Systems

After the destruction inflicted across the Caribbean by hurricanes Harvey, Irma, and Maria, renewable energy advocates are calling for a rethink of the region’s devastated power systems. Rather than simply rebuilding grids that delivered mostly diesel generation via damage-prone overhead power lines, renewables advocates argue that the island grids should leapfrog into the future by interconnecting hundreds or thousands of self-sufficient solar microgrids.

“Puerto Rico will lead the way for the new generation of clean energy infrastructure. The world will follow,” asserted John Berger, CEO of Houston-based solar developer Sunnova Energy, in a tweet before meeting in San Juan with Puerto Rico Governor Ricardo Rosselló this week. Rosselló appears to be on board, inviting Elon Musk via tweet to use Puerto Rico as a “flagship project” to “show the world the power and scalability” of Tesla’s technologies, which include photovoltaic (PV) rooftops and Powerwall battery systems.

Some power system experts, however, say the solar-plus-batteries vision may be oversold. They say that the pressing need to restore power, plus equipment costs and other practical considerations, call for sustained reliance on centralized grids and fossil fuels in the Caribbean. “They need to recover from the storm. Unfortunately I think the quickest way to do that is to go back to how things were before,” says Brad Rockwell, power supply manager for the Kauaʻi Island Utility Cooperative that operates one of the most renewable-heavy grids in the U.S.

Now is a tough time for a debate, given the ongoing power and communications blackouts afflicting many Caribbean islands, including Puerto Rico, the U.S. and British Virgin Islands, Dominica, and St. Martin. As of Thursday 12 October—more than three weeks after Maria’s cyclonic wrecking ball crossed the region—over four-fifths of customers in Puerto Rico and the U.S. Virgin Islands remained without power, according to U.S. Department of Energy status reports.

Puerto Rico lost major transmission lines that carried electricity from oil, coal, and natural gas-fired power plants on its lightly populated south shore to all corners of the territory. Its outage level actually worsened, from 88.3 to 89.4 percent, earlier this week after a tie line went down near San Juan, before improving to an estimated 83 percent by yesterday.

What is clear is that several firms are moving fast even as the debate continues, equipping rooftop solar systems with battery storage that enables consumers to operate independently of stricken grids. For example:

  • German storage system manufacturer sonnen launched a PV-plus-battery collaboration with local Aguadilla-based solar developer Pura Energía early this month;
  • Sunnova is crafting storage options for roughly 10,000 customers in Puerto Rico that it has already equipped with PV systems;
  • Tesla says it is sending “hundreds” of its Powerwall battery systems to Puerto Rico and, after reports of price gouging by independent installers, plans to dispatch installers from the mainland to expand its local teams.

Peter Asmus, a microgrids analyst with Navigant Research, says that such solar microgrids will deliver power to solar system owners far faster than grid restoration, which is still months away for many customers. He says microgrids will also make the island systems more resilient in the long run.

Asmus sees the situation as reminiscent of post-war Europe, when devastated European grids left a vacuum that enabled something better. “They built a more advanced grid than we have in the U.S.,” says Asmus. He says the Caribbean has a similar opportunity today: “The infrastructure was devastated so severely. They can start over with a cleaner slate.”

Some suppliers see microgrids actually supplanting some of the region’s largest transmission lines. “The grid in Puerto Rico will never be built back the way it used to be,” wrote John Merritt, applications engineering director for Austin, Texas-based Ideal Power in an email to IEEE Spectrum. Ideal Power’s multi-port power converters enable microgrids to efficiently swap power between their alternating current and direct current components, including PV systems, generators, and storage batteries.

Giving up big transmission lines sounds optimistic to Rockwell at the Kauaʻi Island Utility Cooperative (KIUC). It would, he says, represent a major system overhaul and thus lost time that Puerto Rico’s residents and economy can ill afford. “The people of Puerto Rico are not going to want to withstand any more delays than they have to while people figure out how to rebuild in a different way,” he says.

Rockwell adds that batteries are still a rather costly way to balance variable renewable generation. He speaks from experience. KIUC’s grid is over four-fifths solar-powered during some midday hours. Several utility-scale storage systems help integrate such a high degree of variable power by quickly covering for lost PV generation when clouds pass overhead or by absorbing surplus midday generation and discharging it after the sun sets. But Rockwell says high battery costs mean KIUC still relies heavily on its diesel power plants.

Merritt at Ideal Power acknowledges that the same is true for microgrids. Integrating solar can cut an island microgrid’s fuel consumption by 60 to 70 percent, slashing operating costs and pollution, but he says diesel generators remain “important” assets. “Moving a site from 24/7 diesel-powered microgrid to a 24/7 solar + storage microgrid would be cost prohibitive in most cases,” says Merritt.
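For a rough sense of scale, here is a back-of-envelope fuel-cost calculation. The load, generator efficiency, and fuel price are invented illustrative assumptions; only the 60 to 70 percent reduction comes from Merritt’s figure.

```python
# Hypothetical island microgrid: all figures below are illustrative assumptions,
# not numbers from the article.
load_kw = 500                   # assumed average load of a small commercial microgrid
hours_per_year = 8760
diesel_kwh_per_liter = 3.5      # assumed generator efficiency
diesel_price_per_liter = 1.00   # assumed delivered fuel price, USD

annual_kwh = load_kw * hours_per_year
liters_all_diesel = annual_kwh / diesel_kwh_per_liter
cost_all_diesel = liters_all_diesel * diesel_price_per_liter

solar_fuel_cut = 0.65           # midpoint of the quoted 60-70 percent range
cost_with_solar = cost_all_diesel * (1 - solar_fuel_cut)

print(round(cost_all_diesel))   # 1251429 USD/year on diesel alone
print(round(cost_with_solar))   # 438000 USD/year: big savings, but diesel remains
```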

There are also questions about PVs’ hardiness. Harvey, Irma, and Maria left many PV systems in shambles. Merritt says that a microgrid for a commercial facility on Saint Croix that Ideal Power participated in assembling before the storms is operating without its six 33-kilowatt solar arrays. While they are out of commission for the next few months, the microgrid is relying solely on its diesel generators, battery, and converters.

Some utility-scale solar plants also took a beating, especially Puerto Rico’s solar array at Humacao. PV panels shattered and flew out of their frames when Maria’s Category-4 winds ripped over the Humacao solar plant, where its French owner Reden Energie was in the process of doubling capacity from 26 to 52 megawatts.

Houston-based microgrid developer Enchanted Rock advocates rugged microgrids supported by natural gas, which is cheaper and cleaner than diesel and may be more reliable than both diesel and solar during heavy weather. “You can build community-type microgrids that have some combination of natural gas generation, solar and storage,” says Enchanted Rock CEO Thomas McAndrew.

Enchanted Rock made a name for itself during Hurricane Harvey when its natural gas-powered microgrids at Houston-area grocery stores and a truck stop turned into hubs for first responders and weary residents. Diesel deliveries were hard to come by for 4-5 days, says McAndrew, but natural gas kept flowing underground throughout the storm.

At present few Caribbean islands have access to natural gas, and even Puerto Rico’s gas infrastructure is limited to one liquefied natural gas (LNG) import terminal that pipes the fuel to two power plants. Before Irma and Maria struck, Rosselló had been working to expand LNG imports so that more of the island’s oil-fired power plants could burn gas.

Enchanted Rock’s McAndrew favors a network to distribute the gas more widely, which he says would be much cheaper than putting power lines underground to protect them from weather. He acknowledges that his proposal is ambitious, but says the outside investors that Puerto Rico will need to attract to support its revival can insist on infrastructure that will survive future storms. As McAndrew puts it: “Whether it’s private or government money, there’s got to be some sense that we might want to do this differently so we don’t just end up rebuilding it every couple of years.”

Why it’s time to lay the stereotype of the ‘teen brain’ to rest

A deficit in the development of the teenage brain has been blamed for teens’ behavior in recent years, but it may be time to lay the stereotype of the wild teenage brain to rest. Brain deficits don’t make teens do risky things; lack of experience and a drive to explore the world are the real factors.

As director of research at a public policy center that studies adolescent risk-taking, I study teenage brains and teenage behavior. Recently, my colleagues and I reviewed years of scientific literature about adolescent brain development and risky behavior.

We found that much of the risk behavior attributed to adolescents is not the result of an out-of-control brain. As it turns out, the evidence supports an alternative interpretation: Risky behavior is a normal part of development and reflects a biologically driven need for exploration – a process aimed at acquiring experience and preparing teens for the complex decisions they will need to make as adults.

We often characterize adolescents as impulsive, reckless and emotionally unstable. We used to attribute this behavior to “raging hormones.” More recently, it’s been popular in some scientific circles to explain adolescent behavior as the result of an imbalance in the development of the brain.

According to this theory, the prefrontal cortex, the center of the brain’s cognitive-control system, matures more slowly than the limbic system, which governs desires and appetites including drives for food and sex. This creates an imbalance in the adolescent brain that leads to even more impulsive and risky behavior than seen in children – or so the theory goes.

This idea has gained currency to the point where it’s become common to refer to the “teenage brain” as the source of the injuries and other maladies that arise during adolescence.

In my view, the most striking failure of the teen brain hypothesis is that it conflates very different kinds of risky behavior, only a fraction of which support the notion of the impulsive, unbridled adolescent.

Adolescents as explorers

What clearly peaks in adolescence is an interest in exploration and novelty seeking. Adolescents are by necessity engaged in exploring essential questions about themselves – who they are, what skills they have and who among their peers is worth socializing with.

But these explorations are not necessarily conducted impulsively. Rising levels of dopamine in the brain during adolescence appear to drive an increased attraction to novel and exciting experiences. Yet this “sensation seeking” behavior is also accompanied by increasing levels of cognitive control that peak at the same age as adolescents’ drive for exploration. This ability to exert cognitive control peaks well before structural brain maturation is complete, at about age 25.

Researchers who attribute this exploratory behavior to recklessness are more likely falling prey to stereotypes about adolescents than assessing what actually motivates their behavior.

If adolescents were truly reckless, they would show a tendency toward risk-taking even when the risks of bad outcomes are known. But they don’t. In experiments where the probabilities of bad outcomes are known, adolescents take fewer risks than children.

In experiments that mimic the well-known marshmallow test, in which waiting for a bigger reward is a sign of self-control, adolescents are less impulsive than children and only slightly more so than adults. While these forms of decision-making may place adolescents at somewhat greater risk of adverse outcomes than adults, the change in this form of self-control from mid-adolescence to adulthood is rather small, and individual differences are great.

There is a specific kind of risk-taking that resembles the imbalance that the brain-development theory points to. It is a form of impulsivity that is insensitive to risk because the person acts without thinking. In this form of impulsivity, the excitement of impulsive urges overshadows the potential to learn from bad experience. For example, people with this form of impulsivity have trouble controlling their use of drugs, something that others learn to do after unpleasant experiences with a drug. Youth with this characteristic often display the tendency early in childhood, and it can become heightened during adolescence. These teens do in fact run a much greater risk of injury and other adverse outcomes.

But it is important to realize that this is characteristic of only a subset of youth with weak ability to control their behavior. Although the rise in injurious and other risky behavior among teens is cause for concern, this represents much more of a rise in the incidence of this behavior than of its prevalence. In other words, while this risky behavior occurs more frequently among teens than children, it is by no means common. The majority of adolescents do not die in car crashes, become victims of homicide or suicide, experience major depression, become addicted to drugs or contract sexually transmitted infections.

Furthermore, the risks of these outcomes among a small segment of adolescents are often evident much earlier, as children, when impulse control problems start to appear.

The importance of wisdom

Considerable research suggests that adolescence and young adulthood is a heightened period of learning that enables a young person to gain the experience needed to cope with life’s challenges. This learning, colloquially known as wisdom, continues to grow well into adulthood. The irony is that most late adolescents and young adults are more able to control their behavior than many older adults, resulting in what some have called the wisdom paradox. Older adults must rely on the store of wisdom they have built to cope with life challenges because their cognitive skills begin to decline as early as the third decade of life.

A dispassionate review of existing research suggests that what adolescents lack is not so much the ability to control their behavior, but the wisdom that adults gain through experience. This takes time and, without it, adolescents and young adults who are still exploring will make mistakes. But these are honest mistakes, so to speak, because for most teens, they do not result from a lack of control.

This realization is not so new, but it serves to place the recent neuroscience of brain development in perspective. It is adolescents’ inexperience, more than an immature brain, that makes them vulnerable to mishaps. And for those with weak cognitive control, the risks are even greater. But we should not let stereotypes of this immaturity color our interpretation of what they are doing. Teenagers are just learning to be adults, and this inevitably involves a certain degree of risk.

Hewlett-Packard historical archives destroyed in Santa Rosa fires

When deadly flames incinerated hundreds of homes in Santa Rosa’s Fountaingrove neighborhood earlier this month, they also destroyed irreplaceable papers and correspondence held nearby and once belonging to the founders of Silicon Valley’s first technology company, Hewlett-Packard.

The Tubbs fire consumed the collected archives of William Hewlett and David Packard, the tech pioneers who in 1938 formed an electronics company in a Palo Alto garage with $538 in cash.

More than 100 boxes of the two men’s writings, correspondence, speeches and other items were contained in one of two modular buildings that burned to the ground at the Fountaingrove headquarters of Keysight Technologies. Keysight, the world’s largest electronics measurement company, traces its roots to HP and acquired the archives in 2014 when its business was split from Agilent Technologies — itself an HP spinoff.

The Hewlett and Packard collections had been appraised in 2005 at nearly $2 million and were part of a wider company archive valued at $3.3 million. However, those acquainted with the archives and the pioneering company’s impact on the technology world said the losses can’t be represented by a dollar figure.

“A huge piece of American business history is gone,” said Brad Whitworth, who had been an HP international affairs manager with oversight of the archives three decades ago. He said Hewlett-Packard had been at the forefront of an industry “that has radically changed our world.”

Karen Lewis, the former HP staff archivist who first assembled the collections, called it irresponsible to put them in a building without proper protection. Both Hewlett-Packard and Agilent earlier had housed the archives within special vaults inside permanent facilities, complete with foam fire retardant and other safeguards, she said.

“This could easily have been prevented, and it’s a huge loss,” Lewis said.

Keysight Technologies spokesman Jeff Weber acknowledged the destruction of the Hewlett and Packard collections, but he disputed the idea that the company had failed to adequately safeguard them.

“Keysight took appropriate and responsible steps to protect the company archives, but the most destructive firestorm in state history prevented efforts to protect portions of the collection,” Weber said in an email. “This is a sad, unfortunate situation — like many others in Sonoma County now. This is a time to begin healing, not assigning blame.”

He added the company “is saddened by the loss of documents that remind us of our visionary founders, rich history and lineage to the original Silicon Valley startup.”

The flames that entered the Keysight campus on Oct. 9 were part of several wildfires that killed at least 23 residents and destroyed 6,800 homes and other buildings in the county.

Among the structures consumed were two beige, flat-roof modular buildings near the Keysight entrance on Fountaingrove Parkway. The buildings, connected by an overhang to a permanent structure, held not only the archives but also a branch office of First Tech Federal Credit Union.

The rest of Keysight’s campus survived with relatively minimal damage from the fire, CEO Ron Nersesian said on Oct. 10. The campus includes four permanent buildings and a recycling storage facility, together constituting nearly a million square feet of office and production space.

The fire and its aftermath have kept the Fountaingrove facility closed for three weeks.

The campus is undergoing disaster recovery work and may reopen for business this week with a limited number of Keysight’s 1,300 Santa Rosa employees, Weber said.

Meanwhile, about 100 staff members have shifted to former HP facilities inside Rohnert Park’s Somo Village. That location could be outfitted for up to 900 staff members by early November, Weber said. Another 200 staff are now reporting to a facility in Petaluma.

After their start in a Palo Alto garage, now a historic landmark dubbed “the Birthplace of Silicon Valley,” Hewlett and Packard found early success with the Walt Disney Company, which in 1940 ordered eight audio oscillators to test speaker systems and other sound devices used in 12 specially equipped theaters showing the animated film “Fantasia.”

Hewlett-Packard and other companies went on to produce testing and measurement devices that remain a largely unheralded part of the tech industry. But analysts and historians said the equipment proved crucial to the development of computers, cellphones and virtually every other device that plugs into a wall or uses a battery.

HP later developed the first hand-held calculator and the first inkjet printer. It also expanded into making personal computers.

Hewlett-Packard’s relationship with Santa Rosa dates back to 1972, when the company first began operations here. The company opened the Fountaingrove campus in 1975.

The Sonoma County operations consistently focused on testing and measurement equipment, even as HP and later Agilent became a major employer, with 5,000 workers here by 2001. But after the dot-com bust, Agilent in 2004 shuttered its Rohnert Park plant and transferred most of its manufacturing overseas.

By the time Hewlett-Packard arrived in Sonoma County, it had been in operation almost four decades and had gained a reputation for a less authoritarian management style aimed at unleashing employee creativity — a collegial approach that came to be known as the “HP Way.”

The company and its founders also affected international affairs. Hewlett-Packard in 1985 became the first technology company to enter into a joint venture in China. And Packard, who temporarily left the company to become deputy secretary of defense in the first Nixon Administration, was known during the Cold War as an advocate of increased trade with Soviet bloc countries in order to foster world peace.

Packard died in 1996 at age 83. Hewlett was 87 when he died in 2001.

A few years before Hewlett-Packard’s 50th anniversary in 1988, Lewis was brought in to build an archive out of the boxes of company photos, writings and other materials.

After reviewing the contents, she said, “I realized, ‘Oh my god, this is the history of Silicon Valley … This is the history of the electronics industry.’”

Raymond Price, a coauthor of “The HP Phenomenon: Innovation and Business Transformation,” said he received limited access to the company archives when researching the 2009 book. But he and coauthor Charles H. House would gladly have delved deeper into the collection of the two founders.

“We would have killed to have had those records and to go through their personal papers,” said Price, a professor emeritus from the University of Illinois. For researchers, he said, the archives contained “such valuable insights into how companies grow.”

“To me it’s just tragic,” he said of the destroyed collections.

In 2005, an appraiser set the collection’s value at $3.3 million and called it one of the most historically significant company archives remaining outside of nonprofit institutions, Lewis said. She recalled that the appraiser had called its primary source material “of the highest possible historical value” for those researching the convergence of technology and business.

Whitworth noted that HP specifically had Lewis oversee the design of a special archive room at the company’s Palo Alto headquarters. The archives, he said, “were a family treasure that was treated that way.”

Lewis said the room was essentially a vault: a receptacle without windows, humidity-controlled, free of ultraviolet light, and protected from fire by foam retardant.

The archives, she said, received the same protection when moved to Agilent Technologies facilities and later when stored in a private site owned by a data storage company.

The archives should have gone to Stanford University, where the founders were alumni, Lewis said.

“These records belonged in the public trust,” she said. “They should not have stayed with a private corporation.”

Instead, the archives were transferred in 2014 from a foundation controlled by Agilent to a similar nonprofit of Keysight.

Both Weber and others suggested the archives came to Santa Rosa largely because Keysight considers its electronic measurement work in direct lineage to the original business of Hewlett and Packard. Some of the first HP devices are displayed in a heritage gallery inside one of the permanent buildings on the Keysight campus.

Weber said only part of the total collection of HP materials was held at the Keysight facility.

“A large portion of that collection stayed with HP during the HP/Agilent split in 1999,” Weber said. “Portions of it stayed with Agilent at the Keysight/Agilent split in 2014, and a small portion came to Keysight.”

However, he acknowledged the materials burned included the personal collections of Hewlett and Packard.

Lewis said those amounted to the heart of the archives and were valued by the appraiser together at $1.9 million.

At the Fountaingrove facility, Weber said, “most of the archives were stored on metal shelving in archival quality folders inside damage-resistant archival boxes in a secure building with a sprinkler system.” He called those steps standard practice for archival collections, and later added that they met or exceeded guidelines set by the United Nations and the Library of Congress.

Whitworth said he doesn’t know what led to the Hewlett and Packard collections being stored in the modular buildings. “Regardless,” he said, “it was a mistake.”

Those who care about HP’s history will be awaiting word about what key documents and materials remain elsewhere.

“We can hope that something was salvaged,” Whitworth said. “But we’re going to be missing volumes.”

UPDATE: Dana Lengkeek, a spokesperson for HP Inc., which also traces its roots to Hewlett and Packard, said in an email Tuesday that the company also holds certain Hewlett-Packard archives in Atlanta, Ga. Among them are “hundreds of items related to HP’s founders, including many examples of speeches, personal correspondence, writings and other materials.” Also, donated personal papers of William Hewlett are part of a public collection held by Stanford University.

Why Can’t We Have Traffic-Calming “3-D” Crosswalks Like Iceland?

Federal transportation engineering guidelines conflate conformity with safety.

People around the world are fascinated by the 3-D illusion of this painted crosswalk in the small town of Ísafjörður in Iceland. It’s a creative and simple way to get motorists to slow down.

But if you try to make an eye-catching crosswalk design in the United States, the transportation engineering establishment won’t approve. That’s what happened to a group of neighbors in St. Louis who painted their local crosswalks and were told by the city the new markings were a safety hazard.

There’s no good research to support that position, so why do authorities frown on any deviation from standard crosswalk design? The Federal Highway Administration’s guidance outlines the agency’s thinking:

In 2011, the FHWA issued an additional Official Ruling that crosswalk art — defined as any freeform design to draw attention to the crosswalk — would degrade the contrast of the white transverse lines against the composition of the pavement beneath it. In deviating from previous Official Rulings on the matter that concluded an increased factor of safety and decreased number of pedestrian deaths were not evident after installation, this 2011 Official Ruling stated that the use of crosswalk art is actually contrary to the goal of increased safety and most likely could be a contributing factor to a false sense of security for both motorists and pedestrians.

Despite the FHWA’s apparent certainty, there is no rigorous empirical evidence that crosswalk art reduces safety for pedestrians.

It would be one thing if the U.S. had an exemplary pedestrian safety record to uphold. Then strict conformity with the “rules” would make good sense. But American streets are dangerous places to walk, and pedestrian fatalities are skyrocketing — rising nearly 50 percent since 2009.

Meanwhile, FHWA is still using discredited studies from 40 years ago to discourage the installation of crosswalks. It’s clear that the flow of car traffic is still prioritized over public safety at the top levels of the American engineering establishment. Instead of overhauling guidelines to reduce the death toll, we get stale guidance that discourages grassroots interventions to make streets safer.

The bottom line: Federal traffic safety officials take conformity with an unsafe system much more seriously than actual safety outcomes.

More recommended reading today: Transportation for America reports that a program that addresses neighborhood public health disparities is under threat in the Trump administration’s heartless budget proposal. And the State Smart Transportation Initiative shares a new tool that can help communities measure walkability.

IBM’s Quest To Design The “New Helvetica”

IBM is no stranger to icons. Over the years, it’s created quite a few: the mainframe computer, the ThinkPad laptop, the Selectric typewriter, the Eye-Bee-M logo. The company hopes its new bespoke typeface IBM Plex, which launched in beta this week (though the official version won’t be released until early 2018), could become just as iconic–a kind of Helvetica for this century.

“When I came to IBM, it was a big discussion: Why does IBM not have a bespoke typeface? Why are we still clinging on to Helvetica?” Mike Abbink, the typeface’s designer and IBM’s executive creative director of brand experience and design, says in a video explainer. “The way we speak to people and the conversations we need to have and we’d like to have, is that still the right way to express ourselves? We should really design a typeface that really reflects our belief system and make it relevant to people now. Helvetica is a child of a particular sect of modernist thinking that’s gone today.”

To uncover what the typeface should express, Abbink and his team took a deep dive into IBM’s archives. They were especially interested in the company’s history in the postwar years, when its design-led business strategy first took shape and the legendary practitioner Paul Rand, who defined design as a system of relationships, created its famous eight-bar logo. In Rand’s logo, Abbink and his team saw a contrast between hard edges–the engineered, rational, and mechanical–and curves–the softer more humanistic elements. It’s a reflection of the man-and-machine relationship that runs through the company’s history–a dynamic that is reflected in the final form of IBM Plex. Each of the letters and glyphs has those hard “engineered” edges and soft “humanistic” curves, just like Rand’s logo.

The Plex family includes sans serif, serif, and monospace versions. The designers also created a rigorous style guide that’s akin to a digital standards manual and includes a type scale, which plays into responsive displays; eight different weights (a nod to how the IBM logo is composed of eight horizontally stacked bars); and usage guidelines, which dive into everything from information hierarchies to color and ragging. All together, it’s easy to see Plex as a gentler, friendlier, more casual Helvetica for a broad range of uses both digital and print-based.

Historically, IBM has used design to distinguish itself, whether it’s creating a better typewriter by introducing the “font ball” or defining laptop computing through the “TrackPoint nub.” Now the company is throwing its weight behind its $1 billion artificial intelligence unit Watson and is–in an effort to allay fear about this technology–positioning it as an assistant to humans rather than a replacement for them. A design tool at its core, IBM Plex is an expression of that same intersection between humans and technology. IBM will make the typeface free for anyone to download and is encouraging its widespread adoption. “If shoe stores or coffee shops or small businesses are using it for their identity, awesome,” Abbink says in the video. “They’re agreeing they want to be part of a discussion around machines and how they’re going to evolve and progress our world.”

So far, the response has been mixed: a thread on Hacker News reveals that many commenters agree with IBM’s decision to create its own consistent visual language. “I think this is all about establishing a new distinctive look,” commenter Ged Byrne writes. “The current one screams ‘1990’ at anybody reading. Now they need something that is distinctly IBM while gently whispering ‘2020’ into the reader’s ear.” Others argue the execution isn’t as sharp as they would like. Some don’t agree with the decision never to use true black, some believe the lighter weights won’t work on low-resolution screens, and some nitpick the 75-character-per-line limit.

The typeface is still a work in progress, but the company is sure about what the end result will be, at least–as Abbink proclaims in the video, “IBM Plex is the new Helvetica.”

Three Advances Make Magnetic Tape More Than a Memory

In the age of flash memory and DNA-based data storage, magnetic tape sounds like an anachronism. But the workhorse storage technology is racing along. Scientists at IBM Research say they can now store 201 gigabits per square inch on a special “sputtered” tape made by Sony Storage Media Solutions.

The palm-size cartridge, into which IBM scientists squeezed a kilometer-long ribbon of tape, could hold 330 terabytes of data, or roughly 330 million books’ worth. By comparison, the largest solid-state drive, made by Seagate, is twice as big and can store 60 TB, while the largest hard disk can store only 12 TB. IBM’s best commercial tape cartridge, which began shipping this year, holds 15 TB.

IBM’s first tape drive, introduced in 1952, had an areal density of 1,400 bits per square inch and a capacity of approximately 2.3 megabytes.

IBM sees a growing business opportunity in tape storage, particularly for storing data in the cloud, which is called cold storage. Hard disks are reaching the end of their capacity scaling. And though flash might be much zippier, tape is by far the cheapest and most energy-efficient medium for storing large amounts of data you don’t need to access much. Think backups, archives, and recovery, says IBM Research scientist Mark Lantz. “I’m not aware of anything commercial or on the time horizon of the next few years that’s at all competitive with tape,” he says. “Tape has huge potential to keep scaling areal density.”

To store data on tape, an electromagnet called a write transducer magnetizes tiny regions (small crystals called grains) of the tape so that the magnetization field of each region points left or right, to encode bits 1 or 0. Heretofore, IBM has increased tape drive density by shrinking those magnetic grains, as well as the read/write transducers and the distance between the transducers and the tape. “The marginal costs of manufacturing remain about the same, so we reduce cost per gigabyte,” Lantz says.

The staggering new leap in density, however, required the IBM-Sony team to bring together several novel technologies. Here are three key advances that led to the prototype tape system reported in the IEEE Transactions on Magnetics in July.

New Tape Schematics

The surface of conventional tape is painted with a magnetic material. Sony instead used a “sputtering” method to coat the tape with a multilayer magnetic metal film. The sputtered film is thinner and has narrower grains, with magnetization that points up or down relative to the surface. This allows more bits in the same tape area.

Think of each bit as a rectangular magnetic region. On IBM’s latest commercially available tape, each bit measures 1,347 by 50 nanometers. (Hard disk bits are 47 by 13 nm.) In the new demo system, the researchers shrank the data bits to 103 by 31 nm. The drastically narrower bits allow more than 20 times as many data tracks to fit in the same width of tape.
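
As a sanity check, the reported areal densities follow directly from those bit dimensions. The quick sketch below (which ignores overhead such as servo bands, guard bands, and error-correction redundancy) converts a rectangular bit cell into bits per square inch:

```python
# Back-of-the-envelope check of the areal densities implied by the
# quoted bit dimensions (all dimensions in nanometers).
NM_PER_INCH = 2.54e7  # 1 inch = 25.4 mm = 2.54e7 nm

def areal_density_gbit_per_in2(bit_length_nm, bit_width_nm):
    """Bits per square inch, in gigabits, for one rectangular bit cell."""
    bits_per_in2 = NM_PER_INCH ** 2 / (bit_length_nm * bit_width_nm)
    return bits_per_in2 / 1e9

# Current commercial tape: 1,347 x 50 nm bits -> roughly 10 Gbit per square inch
commercial = areal_density_gbit_per_in2(1347, 50)
# Demo system: 103 x 31 nm bits -> roughly 202 Gbit per square inch,
# consistent with the 201 Gbit figure reported
demo = areal_density_gbit_per_in2(103, 31)
print(f"commercial: {commercial:.1f} Gbit/in^2, demo: {demo:.1f} Gbit/in^2")
```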

To accommodate such tiny data elements, the IBM team decreased the width of the tape reader to 48 nm and added a thin layer of a highly magnetized material inside the writer, yielding a stronger, sharper magnetic field. Sony also added an ultrathin lubricant layer on the tape surface because the thinner tape comes in closer contact with the read/write heads, causing more friction.

More Precise Servo Control

Much as on magnetic disks, every tape has long, continuous servo tracks running down its length. These special magnetization patterns, which look like tire tracks, are recorded on the tape during the manufacturing process. Servo tracks help read/write heads maintain precise positioning relative to the tape.

The IBM team made the servo pattern shorter, narrower, and more angled in order to match the smaller magnetic grains of the tape media. They also equipped the system with two new signal-processing algorithms. One compares the signal from the servo pattern with a reference pattern to more accurately measure position. The other measures the difference between the desired track position and the actual position of the read/write head, and then controls an actuator to fix that error.
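
That second algorithm is, in essence, a feedback loop. The toy sketch below, a bare proportional controller rather than IBM's actual servo design (the gain and step count here are invented for illustration), shows how repeatedly applying a correction proportional to the position error pulls the head back toward the track center:

```python
# Illustrative track-following loop: turn a measured position error
# into an actuator command that nudges the head toward track center.
def track_follow(desired_nm, actual_nm, gain=0.5, steps=20):
    """Simple proportional feedback; positions in nanometers."""
    history = []
    for _ in range(steps):
        error = desired_nm - actual_nm   # position-error signal
        actual_nm += gain * error        # actuator moves the head
        history.append(actual_nm)
    return history

# Head starts 100 nm off-track; each pass halves the residual error,
# driving the offset toward zero.
positions = track_follow(desired_nm=0.0, actual_nm=100.0)
print(f"final offset: {abs(positions[-1]):.6f} nm")
```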

Together, these advances allow the read/write head to follow a data track to within 6.5-nm accuracy. This happens even as the tape flies by at speeds as high as 4 meters per second (a feat akin to flying an airplane precisely along a yellow line in the road).

Advanced Noise Detection and Error Correction

As the bits get smaller, reading errors go up. “If we squeeze bits closer, the magnetic fields of neighboring bits start to interfere with the ones we’re trying to read,” Lantz says. So the difference between a lower-value 0 signal and a higher-value 1 might be harder to make out. To make up for this, magnetic storage technologies use algorithms that, instead of reading a single bit, take into account signals from a series of bits and decide on the most likely pattern of data that would create the signal. The IBM researchers came up with a new and improved maximum-likelihood sequence-detection algorithm.
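
The idea can be illustrated with a toy model. Here each read-back sample is corrupted by a fraction of the previous bit's signal, and a brute-force detector, standing in for the optimized maximum-likelihood detector in the real drive, picks the bit pattern whose predicted signal best matches what was read. The interference coefficient and noise values are invented for illustration:

```python
from itertools import product

def channel(bits, isi=0.4):
    """Model read-back samples where each sample mixes the current bit
    with a fraction (isi) of the previous bit's signal."""
    prev = 0
    out = []
    for b in bits:
        out.append(b + isi * prev)
        prev = b
    return out

def ml_detect(samples, n, isi=0.4):
    """Exhaustively pick the bit sequence whose predicted samples
    best match the received signal (minimum squared error)."""
    best, best_cost = None, float("inf")
    for candidate in product([0, 1], repeat=n):
        predicted = channel(candidate, isi)
        cost = sum((s - p) ** 2 for s, p in zip(samples, predicted))
        if cost < best_cost:
            best, best_cost = candidate, cost
    return list(best)

# Transmit a pattern, add small noise on top of the interference,
# and recover the original bits anyway.
noise = [0.1, -0.05, 0.08, -0.1, 0.05]
noisy = [s + e for s, e in zip(channel([1, 0, 1, 1, 0]), noise)]
print(ml_detect(noisy, 5))  # recovers [1, 0, 1, 1, 0]
```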

They also improved upon the storage technology’s error-correction coding, which is used to slash bit-reading error rates. In the new system, the raw data goes through two decoders. The first looks for errors along rows, while the stronger second one checks columns. The data is run through these decoders twice.
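
The row-then-column structure can be illustrated with simple parity checks. The production decoders are far stronger than plain parity, which can only pin down a single flipped bit, but the sketch shows why crossing row checks with column checks locates an error:

```python
# Toy two-dimensional parity scheme: a mismatched row check and a
# mismatched column check intersect at the flipped bit.
def parities(grid):
    rows = [sum(r) % 2 for r in grid]
    cols = [sum(c) % 2 for c in zip(*grid)]
    return rows, cols

def correct_single_error(grid, row_parity, col_parity):
    """Recompute parities; fix the bit where bad row meets bad column."""
    rows, cols = parities(grid)
    bad_rows = [i for i, (a, b) in enumerate(zip(rows, row_parity)) if a != b]
    bad_cols = [j for j, (a, b) in enumerate(zip(cols, col_parity)) if a != b]
    if bad_rows and bad_cols:
        i, j = bad_rows[0], bad_cols[0]
        grid[i][j] ^= 1  # flip the erroneous bit back
    return grid

data = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
rp, cp = parities(data)        # parities computed before "writing"
data[1][2] ^= 1                # a read error flips one bit
fixed = correct_single_error(data, rp, cp)
print(fixed)  # original data restored
```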

Today’s Internet Optical Illusion Is This Wobbly Floor

Good morning, this floor is flat. I know it looks like something off the cover of a Dr. Seuss book, but I am the Thomas Friedman of floors, and I am here to tell you it is flat, and was probably designed by a bunch of sadistic people who would like nothing more than to watch you trip and fall on your face. Actually, it’s the entryway to a tile company’s showroom in Manchester, England. Which, frankly, is a pretty good way to sell tile. But also, the “falling on your face” thing. (On the upside, the illusion only works from one direction. The floor looks normal walking out of the building.)

What it looks like going out of the building.

Extra: This is a picture of a flat carpet.

Popularity of rooftop solar arrays posing challenges for firefighters

The increasing use of solar power in the region is forcing fire officials to rethink how their departments fight fires involving the technology, especially when the system is on the roof of a burning building.

Coming into contact with live solar arrays and their wires can cause electrical shock or electrocution, according to area fire officials.

That risk, along with the added weight solar panels put on a roof, has them concerned about the safety of crews working a structure fire at a building with a roof-mounted solar array, they say.

Like live wires at a fire scene, photovoltaic systems have to be de-energized for firefighters to safely come into contact with them, fire officials say. If that isn’t possible, crews then have to stay away from panels and the area of roof where the panels are mounted, they said.

Even if the outside electrical connection to a building has been disconnected, a current would still flow between the building and the solar installation, until that system is somehow disconnected, Jeffrey Chickering, Keene’s deputy fire chief, said.

That can have an effect on the methods used to fight a fire, including where hoses are aimed and at what distance, and where holes are cut in a building’s roof to vent flames and smoke, he said.

Other concerns about fighting fires involving photovoltaic installations include the weight solar panels add to a roof, and the placement of those panels, he said.

Any weight added to a roof can increase the likelihood of some level of building collapse during a fire, he said, and firefighters aren’t going to cut through solar panels to vent a roof.

In addition, if the solar array goes right to a roof’s edge, firefighters aren’t going to hook a metal ladder onto it to gain access to the roof, he said.

“It definitely adds a lot more challenges for us,” he said.

One way Keene fire officials have sought to address that challenge is to work with the city’s building department staff to have a record of which buildings in the city have photovoltaic systems, where those systems are installed, and how they can be shut down, he said.

Another solution is training, he said, which the department did in-house in July 2016.

The training, put out by the Clean Energy States Alliance, included sections on recognizing photovoltaic systems and components, understanding systems’ labeling, ways to disconnect systems and tactical approaches.

According to the instruction material, solar thermal systems, which are used to heat water, don’t present the same risk of electrical shock as photovoltaic systems, but they could be a scalding hazard.

Swanzey Fire Chief Norman W. Skantze said the concept of photovoltaic installations affecting firefighting is fairly new and is not part of regular firefighter training, but is being handled as a seminar-style class.

As residential and commercial photovoltaic systems become more widespread, the training should at some point be offered as part of regular firefighter courses, he said.

“Sooner or later firefighters will encounter the panels in the normal course of their work,” he said. “We don’t want to see any member of the public, or firefighters, injured as a result of new technology.”

His department is evaluating how it can roll into the monthly training curriculum instruction on the potential hazards and strategies for approaching fires involving buildings with photovoltaic arrays, Skantze said.

In the meantime, Swanzey firefighters are being encouraged to participate in a self-paced online course put together by the International Association of Firefighters, he said.

According to the Clean Energy States Alliance training, there are three types of inverters that convert direct-current energy produced by a photovoltaic system into alternating current to match a building’s electrical system and the power grid.

Those inverters stop converting electricity when the utility power shuts down, the training said.

However, that doesn’t stop the photovoltaic system from producing electricity, and that system has to be de-energized manually through disconnection switches, according to the training.

Those disconnects can be inside or outside a house.

Newer systems have what’s called rapid shutdown, which quickly stops the system from producing electricity from one disconnection point, according to the training. Those systems would have been installed after the 2014 edition of the National Electrical Code went into effect, mandating rapid shutdown for photovoltaic systems with rooftop solar arrays.

While there are codes to follow when installing photovoltaic systems, how those codes are applied and enforced ultimately depends on the city or town in New Hampshire, Pablo Fleischmann, co-owner of Green Energy Options in Keene, said.

For example, some communities require that solar panels be installed with a 3-foot setback from the sides of the roof and the ridge, and others don’t, he said.

“What we’re running into here in New Hampshire, it depends on the town,” he said. “Some towns are taking it to the extreme no matter what. Some others don’t have any requirements.”

Solar arrays are made up of modules that are typically anywhere from 25 to 40 volts apiece, he said. It’s when those modules are connected into an array that the voltage increases, he said.

For example, his company installs residential solar arrays up to 600 volts, he said.

Craig J. Bell, general manager of Solar Source in Keene, said that in the electrical code, rapid shutdown means interrupting the direct current of a roof-mounted photovoltaic system within 10 feet of the array on the outside of a building, and within 5 feet once the current enters a building.

Basically, the process lowers the voltage level of an array to 30 volts within 10 seconds, he said.

That voltage is considered safe to touch, and safe for firefighters to be up on a roof working around a solar array as long as they’re not cutting wires, he said.

The 2017 version of the code takes that a step further and requires there be an external means to activate a rapid shutdown, he said.

It also requires a distinction between labeling of systems with rapid shutdown depending on whether they were installed under the 2014 or 2017 code, he said.

Another requirement in the 2017 code changes the boundaries of the effects of the rapid shutdown to within 1 foot of a solar array on the outside of a building, and within 3 feet from where the current from the array enters a building, he said.

Further, it requires a rapid shutdown system to lower the voltage outside the 1- or 3-foot boundaries to 30 volts within 30 seconds, he said. Inside the boundaries, the voltage must be lowered to 80 volts within 30 seconds, but that change won’t take effect until Jan. 1, 2019, he said.
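
Summarized as data, the rapid-shutdown requirements described above look roughly like this. The field names are invented for illustration, and the values are as characterized in this article; the actual National Electrical Code text governs:

```python
# Hypothetical summary of rapid-shutdown limits by NEC edition,
# per the figures quoted in this article (not the code text itself).
RAPID_SHUTDOWN = {
    2014: {
        "boundary_outside_ft": 10,  # within 10 ft of the array, outside
        "boundary_inside_ft": 5,    # within 5 ft once conductors enter
        "limit_volts": 30,          # lower to 30 V...
        "within_seconds": 10,       # ...within 10 seconds
    },
    2017: {
        "boundary_outside_ft": 1,
        "boundary_inside_ft": 3,
        "outside_boundary_limit_volts": 30,  # within 30 seconds
        "inside_boundary_limit_volts": 80,   # within 30 s; takes effect Jan 1, 2019
        "within_seconds": 30,
        "external_initiation_required": True,
    },
}

print(RAPID_SHUTDOWN[2017]["outside_boundary_limit_volts"])
```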

The Keene and Swanzey fire departments haven’t been called to structure fires in their towns involving a residential or commercial roof-mounted solar array, but their counterpart in Peterborough has.

Peterborough Fire Chief Ed Walker said the call came from a small hydroelectric facility about three years ago. Some type of fault in the solar panels on the building’s roof had caused them to ignite, he said. Firefighters stayed away from the roof and the panels and were able to put out the fire using water sprayed from an aerial ladder, he said.

Solar installations on the roofs of residential buildings aren’t a big problem, as they’re typically only on one side, he said. Roof-mounted commercial systems can be quite problematic because the roofs of those buildings are usually flat, with solar panels covering them, he said.

Another factor to consider is that as solar technology advances, the traditional solar panel design is changing, Walker said. For example, solar panels designed to mimic shingles are now available, he said.

As photovoltaic systems become more popular, it’s important for firefighters to be aware of the systems, the risks they pose when fighting fires, and how to minimize those risks, he said. Peterborough firefighters have so far done some online training and had discussions about the topic, he said.

“If you don’t see something a lot, you don’t worry about it,” he said. “This is something we’re seeing a lot of now, so we have to worry about it.”

​Microsoft just ended support for Office 2007 and Outlook 2007 | ZDNet

Microsoft is urging customers still on Outlook 2007 and Office 2007 to upgrade as each of the products ran out of extended support on Tuesday.

That means no more security updates, feature updates, support or technical notes for the products, which Microsoft has supported for the past decade.

Microsoft wants customers on Office 2007 to plan to migrate to Office 365 in the cloud or to upgrade to Office 2016.

Office 2007 introduced Microsoft’s “ribbon” interface that brought a series of tabbed toolbars with each ribbon containing related buttons.

For customers already on Office 365 who still use Outlook 2007, it will be important to upgrade by the end of October, after which the product won’t allow users to access Exchange Online mailboxes through the Office 365 portal.

“Customers who use Office 365 will have noted that there is a change to the supported client connectivity methods. Outlook Anywhere is being replaced with MAPI/HTTP. Outlook 2007 does not support MAPI/HTTP, and as such will be unable to connect,” Microsoft highlights in a send-off note for the email client.

Come October 31, Microsoft will drop support for the RPC over HTTP protocol, also known as Outlook Anywhere, for accessing mail data from Exchange Online. The new protocol, MAPI over HTTP, is sturdier and supports multi-factor authentication for Office 365, according to Microsoft. Microsoft didn’t backport the protocol to Outlook 2007 as it would be past its extended support date by the time it cut off Outlook Anywhere.

Microsoft has a full list of Office 2007 products and their exact cutoff dates here, and Outlook 2007 here.

Unlike in previous years, Microsoft is not offering enterprise customers extended support for Office 2007 through its custom support contracts. The same goes for its other Office products, including Exchange Server, Office Suites, SharePoint Server, Office Communications Server, Lync Server, Skype for Business Server, Project Server and Visio.

Microsoft said demand for custom support has declined with greater adoption of Office 365.