Stephen Fry’s blasphemy probe dropped after Irish police fail to find ‘enough outraged people’ | The Independent

An Irish police investigation into allegedly blasphemous comments made by Stephen Fry has been dropped after detectives decided there were not enough people who had been outraged by the remarks.

Police launched an investigation into the presenter, author and comedian after he described God as “capricious”, “mean-minded”, “stupid” and an “utter maniac” during an appearance on Irish television show “The Meaning of Life” in February 2015.

The comments were widely reported but did not become a legal matter until a man complained last year, prompting a police enquiry.

Under Irish law, it is illegal to use words that are “grossly abusive or insulting in relation to matters sacred to any religion, thereby intentionally causing outrage among a substantial number of adherents of that religion”.

After initial inquiries, officers decided that not enough people had been outraged by Mr Fry’s remarks to warrant further investigation, according to the Irish Independent.

A source told the paper: “This man was simply a witness and not an injured party. Gardaí (Irish police) were unable to find a substantial number of outraged people.

“For this reason the investigation has been concluded.”

Asked in 2015 by the programme’s host, Gay Byrne, what he would say to God if he arrived in heaven, Mr Fry replied: “I’d say, bone cancer in children? What’s that about?”

“How dare you? How dare you create a world to which there is such misery that is not our fault? It’s not right, it’s utterly, utterly evil.

“Why should I respect a capricious, mean-minded, stupid god who creates a world that is so full of injustice and pain?

“We have to spend our life on our knees thanking him? What kind of god would do that?

“The god who created this universe, if it was created by god, is quite clearly a maniac, an utter maniac, totally selfish.”

Under Ireland’s 2009 Defamation Act, anyone “who publishes or utters blasphemous matter shall be guilty of an offence” and liable for a fine of up to €25,000 (£21,200).

The man who made the initial complaint about Mr Fry is said to have been satisfied that Irish police had investigated the matter fully and told detectives he was merely doing his civic duty in reporting it.

Given there was no one deemed to be harmed by the comments, the case is now said to have been closed.

http://www.independent.co.uk/news/world/europe/stephen-fry-blasphemy-ireland-probe-investigation-dropped-police-gardai-not-enough-outrage-a7725116.html

Forest of the Future Library – Oslo, Norway – Atlas Obscura

The future of physical books may seem tenuous, but at least 1,000 copies of 100 different books are set to be printed roughly a century from now, in the year 2114. Indeed, the trees that will be used to make the printing paper have already been planted.

These saplings were planted for the Future Library, a forward-looking art project that’s one part literary time capsule, one part environmental statement. The project, launched in 2014, plans to commission one book from a different author each year for 100 years, none of which will be published until 2114.

The stories will be printed on paper made from the 1,000 trees planted in Nordmarka, a forest just north of Oslo. Until then, the manuscripts will be kept on the top floor of the Deichman Library in Oslo, in a space called the “Silent Room.” They will be on display but not made available to read for generations to come, meaning most of us won’t ever get the chance.

The Silent Room, set to open in 2019, will be made from the wood that was cleared from the Nordmarka forest to make room for the new trees. The room will only be big enough for a few visitors at a time, and will offer a view of the growing forest off in the distance.

Scottish artist Katie Paterson, who conceived of the Future Library project, often uses time and nature for her art. Past projects include mapping dead stars, sending a meteorite back into space, and live broadcasting sounds made by a melting glacier.

Paterson knows she will probably not see the finished product of her century-long project, but she plans to attend the Handover Ceremony for as long as she can. At the ceremony, held each spring, the author selected for that year’s text holds a reading in the future forest before delivering the manuscript. The first author to participate was, perhaps fittingly, the popular dystopian fiction author Margaret Atwood. Her 2014 contribution, Scribbler Moon, has never been read. The next year brought Cloud Atlas author David Mitchell, who contributed a book titled From Me Flows What You Call Time. Next up is Icelandic writer Sjón.

http://www.atlasobscura.com/places/forest-of-the-future-library

PennDOT Road Sign Sculpture Garden – Meadville, Pennsylvania – Atlas Obscura

Along the Smock Highway in Meadville, Pennsylvania, in an otherwise drab stretch of strip malls, the north side of the road is lined by a colorful pattern of repurposed road signs that stretches for nearly a quarter mile. Located on the property of the Pennsylvania Department of Transportation building (hence the name “PennDOT”), the PennDOT Road Sign Sculpture Garden is the most perplexing set of driving instructions you’ll ever see.

Alternatively known as “Read Between the Signs,” the road sign sculpture garden was created when art students from the nearby Allegheny College teamed up with DoT employees who had a few extra road signs on their hands. The result was both bizarre and ingenious, portraying the Pennsylvania landscape with a recycled metal homage.

Along the sculpture garden’s 10-foot mural wall, Smock Highway drivers will pass by oceanic waves made of blue hospital signs, a barn constructed from red stop signs, and an adjacent silo made of white junction signs. Turkeys, sombreros, oil wells, kayakers, rainbow hot air balloons, and Ferris wheels movable by hand are portrayed through creative road sign assortments, and an intricately accurate road sign model of Allegheny College’s Bentley Hall sits toward the wall’s western end. Farmers will enjoy the sculpture garden’s takes on various farm animals, and theme park enthusiasts will be thrilled by a roller coaster slope made of 45 mph signs.

http://www.atlasobscura.com/places/penndot-road-sign-sculpture-garden

Nilometer – Cairo, Egypt – Atlas Obscura

In ancient Egypt, the behavior of the Nile could mean life or death each harvest season. So, long before the Aswan Dam was constructed to manage the flooding of the great river, Egyptians invented an instrument to measure the waters in order to predict the Nile’s behavior: the nilometer.

There were three kinds of nilometers, and examples of all three can still be seen around Egypt. The simplest was a tall column housed in a submerged stone structure called a stilling well. One of these nilometers can be seen on Rhoda (or Rawda) Island in Cairo, an octagonal marble column held in place by a wooden beam at the top that spans the width of the well. The stilling well included a staircase so that priests, who were in charge of monitoring the nilometers, could walk down and examine the column.

Nilometers were used for measuring water levels as early as 5,000 years ago. The nilometer on Rhoda Island dates back to 861, when it was built where an older nilometer had been, based on a design by Afraganus, a famous astronomer. The massive measuring stick had markings on it to indicate where the water level was at any given time, information the priests would use to determine what conditions the future held: drought, which would mean famine; desirable, which would mean just enough overflow to leave good soil for farming; or flood, which could be catastrophic.

Only priests and rulers, whether pharaohs or later, Roman or Arab leaders, were allowed to monitor the nilometers, and their ability to predict the behavior of the Nile was used to impress the common people. (And to determine how much money would be collected in taxes.) This is why so many nilometers were built in temples, where only priests would be able to access the mysterious instrument.

The nilometer on Rhoda Island is today housed in a modernized building. Its conical roof replaced an older dome that was destroyed during the French occupation of Egypt (1798–1801). The interior is ornately carved, and the three tunnels that once let water into the stilling well at different levels have been filled in, so visitors can walk all the way down.

Another type of nilometer, like the one that can be seen on Elephantine Island in Aswan, had evenly spaced steps that led straight down to the Nile, and indicator markings on the walls at different levels for each step. This one was often the first to indicate what conditions to expect, being located near Egypt’s southern border. The third kind, an example of which can be seen at the Temple of Kom Ombo, a little further north, brought the water away from the Nile by way of a canal that deposited it into a cistern. And again, the indicating markers were carved into the wall, accessible by staircases for the priests and rulers who predicted the fate of the Egyptian crop.

http://www.atlasobscura.com/places/nilometer

MP3 is dead, long live AAC

MP3, the format that revolutionized the way we consume (and steal) music from the 1990s on, has been officially retired, in a manner of speaking. The German research institution that created the format, the Fraunhofer Institute for Integrated Circuits, announced that it had terminated licensing for certain MP3-related patents. In other words, it didn’t want to keep the format on life support, because there are better ways to store music in 2017. Rest now forever, MP3.

In its place, the director of the Fraunhofer Institute told NPR, the Advanced Audio Coding (AAC) format has become the “de facto standard for music download and videos on mobile phones.” It’s simply more efficient and has greater functionality, as streaming TV and radio broadcasting use the format to deliver higher-quality audio at lower bitrates than MP3.

Basic research in audio encoding began at the Friedrich-Alexander University of Erlangen-Nuremberg in the late 1980s. Researchers there joined forces with the Fraunhofer Institute, and the result was the humble MP3 standard. An MP3 takes up roughly 10 percent of the storage space of the original file, a monumental reduction at the time. According to Stephen Witt’s book How Music Got Free, corporate sabotage and other failures almost consigned the MP3 to irrelevancy. Finally, Fraunhofer simply started giving away software consumers could use to rip songs from compact discs to MP3 files on their home computers, after which the format took off.
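That 10 percent figure is easy to sanity-check with back-of-envelope arithmetic. The numbers below are illustrative, not from the article: CD-quality PCM audio runs at 44,100 samples per second, 16 bits, two channels, and 128 kbps was a typical MP3 bitrate of the era.

```python
# Back-of-envelope check of the "10 percent" claim.
# CD-quality PCM: 44,100 samples/s x 16 bits x 2 channels.
pcm_bps = 44_100 * 16 * 2      # 1,411,200 bits per second, uncompressed
mp3_bps = 128_000              # a typical MP3 bitrate of the era

ratio = mp3_bps / pcm_bps
print(f"128 kbps MP3 uses {ratio:.1%} of the uncompressed size")

# For a 4-minute song:
seconds = 4 * 60
wav_mb = pcm_bps * seconds / 8 / 1_000_000   # bits -> megabytes
mp3_mb = mp3_bps * seconds / 8 / 1_000_000
print(f"{wav_mb:.1f} MB of raw audio vs {mp3_mb:.2f} MB as MP3")
```

At these bitrates the MP3 comes out around 9 percent of the raw size, in line with the rough figure quoted above.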

By the end of the 90s, however, those tiny files were zipping around the nascent internet, spawning a gold rush of digital piracy. The format ruled illegal sharing for years as sites like Napster and Kazaa hosted popular peer-to-peer services that allowed folks to download songs with a click. Of course, the format also enabled development on the legal side of the aisle as online vendors scrambled to lawfully meet the connected public’s need for digitally acquired music.

Apple’s iTunes Store dominated that market, funneling music onto the company’s answer to the MP3 player, the iPod. Apple gave users the option of using AAC almost from the start, and that format has proven to be MP3’s eventual successor. But MP3 deserves its place in history for enabling casual users to experience for the first time the internet’s true (if dubiously legal) potential for exchanging data.

https://www.engadget.com/2017/05/12/mp3-is-dead-long-live-aac/

When is Mother’s Day and what is the history of Mother’s Day? — Quartz

Unearthing the most complete picture of the origins of Mother’s Day began with a misplaced cardboard box in the kitchen of a West Virginia church.

It was 2003, and Katharine Antolini, then a graduate student researching the history of motherhood, had been asked to speak at the site in Grafton, where the first known celebration of Mother’s Day took place in 1908.

During the church tour, she stumbled upon the box on the floor of the kitchen. “Some of the documents were almost 100 years old,” she told Quartz. “I was like, ‘You can’t leave these here!’”

Antolini offered to archive the papers, and in doing so, became fascinated with what she found: the unlikely story of Anna Jarvis, who first spent years crusading to make Mother’s Day a widely observed holiday—and then spent the rest of her life trying to undo it.

The project inspired Antolini to keep digging, and Jarvis’ story became her PhD dissertation and later a book, Memorializing Motherhood: Anna Jarvis and the Struggle for Control of Mother’s Day.

“I realized how very complicated the history was,” Antolini said. “She ultimately failed, of course. But she never gave up until her death of trying to fight for her day.”

How did Mother’s Day come to be?

When her mother, Ann Reeves, died in 1905, Jarvis organized the first observances of Mother’s Day at the Andrews Methodist Episcopal Church in Grafton, and later in Philadelphia, Pennsylvania.

She chose the second Sunday in May because it was the Sunday closest to her mother’s death. The white carnation, Reeves’ favorite flower, became the holiday’s symbol.

Jarvis, who had no kids of her own, had a singular understanding of what Mother’s Day should mean. She saw it as a day for children to visit their mothers at home and to remember the sacrifices that they had made.

“This is not a celebration of maudlin sentiment. It is one of practical benefit and patriotism, emphasizing the home as the highest inspiration of our individual and national lives,” Jarvis wrote.

She began a letter-writing campaign, reaching out to anyone who she thought could help promulgate the idea. “Any mayor, merchant or minister,” Antolini said. She wrote to Teddy Roosevelt. To Mark Twain. To every state governor, every year.

By 1914, when most states had already recognized the day locally, the US Congress passed a law designating Mother’s Day a national holiday. A day later, president Woodrow Wilson issued an official proclamation.

Why did the holiday’s champion crusade against it?

As Mother’s Day grew more popular, Jarvis became distressed by what she saw as other groups co-opting the day she felt a sense of ownership over.

“She battled over the meaning of Mother’s Day,” Antolini said. Jarvis took issue with businesses, like greeting-card companies and florists, that she felt began exploiting “her day” to maximize their own profits.

Jarvis was also angered by the actions of charities like the American War Mothers, which began selling white carnations on Mother’s Day as a fundraising tool. She was once arrested in Philadelphia for showing up unannounced at the group’s convention.

She crashed the event, yelling at members of the nonprofit organization for exploiting Mother’s Day, Antolini said. A newspaper account said Jarvis was initially charged with disturbing the peace, but was later let off by a local magistrate.

“It was very much tied to her ego…that was her life, her fortune—everything was invested in it,” Antolini said. “When people started to take the day and lay claim to it and use it for different purposes, that bothered her.”

Ultimately, the fight—against the greeting-card companies, the candy shops, the nonprofits—would consume her. “She lost everything. It was a movement that economically, emotionally and physically drained her,” until her death in 1948, Antolini said. “She spent the last four years of her life in a sanitarium, blind and penniless.”

NYU Accidentally Exposed Military Code-breaking Computer Project to Entire Internet

In early December 2016, Adam was doing what he’s always doing, something between hobby and profession: looking for things on the internet that shouldn’t be there. That week, he came across a server inside New York University’s famed Institute for Mathematics and Advanced Supercomputing, headed by the brilliant Chudnovsky brothers, David and Gregory. The server appeared to be an internet-connected backup drive. But instead of being filled with family photos and spreadsheets, this drive held confidential information on an advanced code-breaking machine that had never before been described in public. Dozens of documents spanning hundreds of pages detailed the project, a joint supercomputing initiative administered by NYU, the Department of Defense, and IBM. And they were available for the entire world to download.

The supercomputer described in the trove, “WindsorGreen,” was a system designed to excel at the sort of complex mathematics that underlies encryption, the technology that keeps data private, and almost certainly intended for use by the Defense Department’s signals intelligence wing, the National Security Agency. WindsorGreen was the successor to another password-cracking machine used by the NSA, “WindsorBlue,” which was also documented in the material leaked from NYU and which had been previously described in the Norwegian press thanks to a document provided by National Security Agency whistleblower Edward Snowden. Both systems were intended for use by the Pentagon and a select few other Western governments, including Canada and Norway.

Adam, an American digital security researcher, requested that his real name not be published out of fear of losing his day job. Although he deals constantly with digital carelessness, Adam was nonetheless stunned by what NYU had made available to the world. “The fact that this software, these spec sheets, and all the manuals to go with it were sitting out in the open for anyone to copy is just simply mind blowing,” he said.

He described to The Intercept how easy it would have been for someone to obtain the material, which was marked with warnings like “DISTRIBUTION LIMITED TO U.S. GOVERNMENT AGENCIES ONLY,” “REQUESTS FOR THIS DOCUMENT MUST BE REFERRED TO AND APPROVED BY THE DOD,” and “IBM Confidential.” At the time of his discovery, Adam wrote to me in an email:

All of this leaky data is courtesy of what I can only assume are misconfigurations in the IMAS (Institute for Mathematics and Advanced Supercomputing) department at NYU. Not even a single username or password separates these files from the public internet right now. It’s absolute insanity.

The files were taken down after Adam notified NYU.

Intelligence agencies like the NSA hide code-breaking advances like WindsorGreen because their disclosure might accelerate what has become a cryptographic arms race. Encrypting information on a computer used to be a dark art shared between militaries and mathematicians. But advances in cryptography, and rapidly swelling interest in privacy in the wake of Snowden, have helped make encryption tech an effortless, everyday commodity for consumers. Web connections are increasingly shielded using the HTTPS protocol, end-to-end encryption has come to popular chat platforms like WhatsApp, and secure phone calls can now be enabled simply by downloading some software to your device. The average person viewing their checking account online or chatting on iMessage might not realize the mathematical complexity that’s gone into making eavesdropping impractical.

The spread of encryption is a good thing, unless you’re the one trying to eavesdrop. Spy shops like the NSA can sometimes thwart encryption by going around it, finding flaws in the way programmers build their apps or taking advantage of improperly configured devices. When that fails, they may try to deduce encryption keys through extraordinarily complex math or repeated guessing. This is where specialized systems like WindsorGreen can give the NSA an edge, particularly when the agency’s targets aren’t aware of just how much code-breaking computing power they’re up against.

Adam declined to comment on the specifics of any conversations he might have had with the Department of Defense or IBM. He added that NYU, at the very least, expressed its gratitude to him for notifying it of the leak by mailing him a poster.

While he was trying to figure out who exactly the Windsor files belonged to and just how they’d wound up on a completely naked folder on the internet, Adam called David Chudnovsky, the world-renowned mathematician and IMAS co-director at NYU. Reaching Chudnovsky was a cinch, because his entire email outbox, including correspondence with active members of the U.S. military, was for some reason stored on the NYU drive and made publicly available alongside the Windsor documents. According to Adam, Chudnovsky confirmed his knowledge of and the university’s involvement in the supercomputing project; The Intercept was unable to reach Chudnovsky directly to confirm this. The school’s association is also strongly indicated by the fact that David’s brother Gregory, himself an eminent mathematician and professor at NYU, is listed as an author of a 164-page document from the cache describing the capabilities of WindsorGreen in great detail. Although the brothers clearly have ties to WindsorGreen, there is no indication they were responsible for the leak. Indeed, the identity of the person or persons responsible for putting a box filled with military secrets on the public internet remains utterly unclear.

An NYU spokesperson would not comment on the university’s relationship with the Department of Defense, IBM, or the Windsor programs in general. When The Intercept initially asked about WindsorGreen the spokesperson seemed unfamiliar with the project, saying they were “unable to find anything that meets your description.” This same spokesperson later added that “no NYU or NYU Tandon system was breached,” referring to the Tandon School of Engineering, which houses the IMAS. This statement is something of a non sequitur, since, according to Adam, the files leaked simply by being exposed to the open internet — none of the material was protected by a username, password, or firewall of any kind, so no “breach” would have been necessary. You can’t kick down a wide open door.

The documents, replete with intricate processor diagrams, lengthy mathematical proofs, and other exhaustive technical schematics, are dated from 2005 to 2012, when WindsorGreen appears to have been in development. Some documents are clearly marked as drafts, with notes that they were to be reviewed again in 2013. Project progress estimates suggest the computer wouldn’t have been ready for use until 2014 at the earliest. All of the documents appear to be proprietary to IBM and not classified by any government agency, although some are stamped with the aforementioned warnings restricting distribution to within the U.S. government. According to one WindsorGreen document, work on the project was restricted to American citizens, with some positions requiring a top-secret security clearance — which as Adam explains, makes the NYU hard drive an even greater blunder:

Let’s, just for hypotheticals, say that China found the same exposed NYU lab server that I did and downloaded all the stuff I downloaded. That simple act alone, to a large degree, negates a humongous competitive advantage we thought the U.S. had over other countries when it comes to supercomputing.

The only tool Adam used to find the NYU trove was Shodan.io, a website that’s roughly equivalent to Google for internet-connected, and typically unsecured, computers and appliances around the world, famous for turning up everything from baby monitors to farming equipment. Shodan has plenty of constructive technical uses but also serves as a constant reminder that we really ought to stop plugging things into the internet that have no business being there.

The WindsorGreen documents are mostly inscrutable to anyone without a Ph.D. in a related field, but they make clear that the computer is the successor to WindsorBlue, a next generation of specialized IBM hardware that would excel at cracking encryption, whose known customers are the U.S. government and its partners.

Experts who reviewed the IBM documents said WindsorGreen possesses substantially greater computing power than WindsorBlue, making it particularly adept at compromising encryption and passwords. In an overview of WindsorGreen, the computer is described as a “redesign” centered around an improved version of its processor, known as an “application specific integrated circuit,” or ASIC, a type of chip built to do one task, like mining bitcoin, extremely well, as opposed to being relatively good at accomplishing the wide range of tasks that, say, a typical MacBook would handle. One of the upgrades was to switch the processor to smaller transistors, allowing more circuitry to be crammed into the same area, a change quantified by measuring the reduction in nanometers (nm) between certain chip features. The overview states:

The WindsorGreen ASIC is a second-generation redesign of the WindsorBlue ASIC that moves from 90 nm to 32 nm ASIC technology and incorporates performance enhancements based on our experience with WindsorBlue. We expect to achieve at least twice the performance of the WindsorBlue ASIC with half the area, reduced cost, and an objective of half the power. We also expect our system development cost to be only a small fraction of the WindsorBlue development cost because we carry forward intact much of the WindsorBlue infrastructure.
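The quoted figures can be put in context with a rough, idealized calculation (a sketch of the general scaling rule, not anything taken from the documents): under ideal scaling, the area each transistor occupies shrinks with the square of the feature size.

```python
# Idealized density scaling for a 90 nm -> 32 nm process shrink:
# area per transistor falls roughly with the square of the feature size.
# Real shrinks fall short of this ideal, but it shows the order of magnitude.
old_nm, new_nm = 90, 32
density_gain = (old_nm / new_nm) ** 2
print(f"~{density_gain:.1f}x more circuitry per unit area")

# The document's stated goal of 2x the performance in half the area
# works out to a 4x gain in performance per unit area.
perf_per_area = 2 / 0.5
print(f"claimed performance-per-area gain: {perf_per_area:.0f}x")
```

An ideal 90 nm to 32 nm shrink yields nearly 8x the transistor density, which is why the designers could promise more performance in less silicon.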

Çetin Kaya Koç is the director of the Koç Lab at the University of California, Santa Barbara, which conducts cryptographic research. Koç reviewed the Windsor documents and told The Intercept that he has “not seen anything like [WindsorGreen],” and that “it is beyond what is commercially or academically available.” He added that outside of computational biology applications like complex gene sequencing (which it’s probably safe to say the NSA is not involved in), the only other purpose for such a machine would be code-breaking: “Probably no other problem deserves this much attention to design an expensive computer like this.”

Andrew “Bunnie” Huang, a hacker and computer hardware researcher who reviewed the documents at The Intercept’s request, said that WindsorGreen would surpass many of the most powerful code-breaking systems in the world: “My guess is this thing, compared to the TOP500 supercomputers at the time (and probably even today) pretty much wipes the floor with them for anything crypto-related.” Conducting a “cursory inspection of power and performance metrics,” according to Huang, puts WindsorGreen “heads and shoulders above any publicly disclosed capability” on the TOP500, a global ranking of supercomputers. Like all computers that use specialized processors, or ASICs, WindsorGreen appears to be a niche computer that excels at one kind of task but performs miserably at anything else. Still, when it comes to crypto-breaking, Huang believes WindsorGreen would be “many orders of magnitude … ahead of the fastest machines I previously knew of.”

But even with expert analysis, no one beyond those who built the thing can be entirely certain of how exactly an agency like the NSA might use WindsorGreen. To get a better sense of why a spy agency would do business with IBM, and how WindsorGreen might evolve into WindsorOrange (or whatever the next generation may be called), it helps to look at documents provided by Snowden that show how WindsorBlue was viewed in the intelligence community. Internal memos from Government Communications Headquarters, the NSA’s British counterpart, show that the agency was interested in purchasing WindsorBlue as part of its High Performance Computing initiative, which sought to help with a major problem: People around the world were getting too good at keeping unwanted eyes out of their data.

Under the header “what is it, and why,” one 2012 HPC document explains, “Over the past 18 months, the Password Recovery Service has seen rapidly increasing volumes of encrypted traffic … the use of much greater range of encryption techniques by our targets, and improved sophistication of both the techniques themselves and the passwords targets are using (due to improved OPSec awareness).” Accordingly, GCHQ had begun to “investigate the acquisition of WINDSORBLUE … and, subject to project board approval, the procurement of the infrastructure required to host the a [sic] WINDSORBLUE system at Benhall,” where the organization is headquartered.

Among the Windsor documents on the NYU hard drive was an illustration of an IBM computer code-named “Cyclops,” which appears to be a WindsorBlue/WindsorGreen predecessor. A GCHQ document provided by Snowden describes Cyclops as an “NSA/IBM joint development.”

In April 2014, Norway’s Dagbladet newspaper reported that the Norwegian Intelligence Service had purchased a cryptographic computer system code-named STEELWINTER, based on WindsorBlue, as part of a $100 million overhaul of the agency’s intelligence-processing capabilities. The report was based on a document provided by Snowden:

The document does not say when the computer will be delivered, but in addition to the actual purchase, NIS has entered into a partnership with NSA to develop software for decryption. Some of the most interesting data NIS collects are encrypted, and the extensive processes for decryption require huge amounts of computing power.

Widespread modern encryption methods like RSA, named for the initials of the cryptographers who developed it, rely on the use of hugely complex numbers derived from prime numbers. Speaking very roughly, so long as those original prime numbers remain secret, the integrity of the encoded data will remain safe. But were someone able to factor the hugely complex number — a process identical to the sort of math exercise children are taught to do on a chalkboard, but on a massive scale — they would be able to decode the data on their own. Luckily for those using encryption, the numbers in question are so long that they can only be factored down to their prime numbers with an extremely large amount of computing power. Unluckily for those using encryption, government agencies in the U.S., Norway, and around the globe are keenly interested in computers designed to excel at exactly this purpose.
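A toy example makes the point concrete. The primes below are textbook-sized, nothing like real key lengths: anyone who can factor the public modulus can rebuild the private key and decrypt at will, which is exactly what enormous computing power threatens to do at real scales.

```python
# Toy RSA (textbook numbers, not a real key) showing why factoring breaks it.
p, q = 61, 53                        # the secret primes
n = p * q                            # 3233, the public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+ pow)

ciphertext = pow(65, e, n)           # encrypt the message 65

def factor(n):
    """Trial division: hopeless for real key sizes, instant here."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

# An attacker who factors n recovers the primes, rebuilds d, and decrypts:
fp, fq = factor(n)
cracked_d = pow(e, -1, (fp - 1) * (fq - 1))
print(pow(ciphertext, cracked_d, n))  # recovers 65
```

With 61 and 53 the attack takes microseconds; with primes hundreds of digits long, trial division (and every known classical algorithm) needs the kind of computing power described above.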

Given the billions of signals intelligence records collected by Western intelligence agencies every day, enormous computing power is required to sift through this data and crack what can be broken so that it can be further analyzed, whether through the factoring method mentioned above or via what’s known as a “brute force” attack, wherein a computer essentially guesses possible keys at a tremendous rate until one works. The NIS commented only to Dagbladet that the agency “handles large amounts of data and needs a relatively high computing power.” Details about how exactly such “high computing power” is achieved are typically held very close — finding hundreds of pages of documentation on a U.S. military code-breaking box, completely unguarded, is virtually unheard of.
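The “brute force” approach mentioned above is simple to sketch (the password and wordlist here are made up for illustration): hash each candidate and compare it against the stolen digest. Purpose-built hardware does exactly this, just astronomically faster.

```python
import hashlib

# Hypothetical leaked hash: in a real breach only the digest is known.
stolen = hashlib.sha256(b"sunshine").hexdigest()

# A (tiny, made-up) candidate wordlist.
candidates = [b"password", b"letmein", b"sunshine", b"dragon"]

def brute_force(target_hex, guesses):
    """Hash each guess until one matches the target digest."""
    for guess in guesses:
        if hashlib.sha256(guess).hexdigest() == target_hex:
            return guess
    return None

print(brute_force(stolen, candidates))  # b'sunshine'
```

The only variables in the attacker’s favor are guesses per second and the size of the search space, which is why specialized systems matter so much.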

A very important question remains: What exactly could WindsorBlue, and then WindsorGreen, crack? Are modern privacy mainstays like PGP, used to encrypt email, or the ciphers behind encrypted chat apps like Signal under threat? The experts who spoke to The Intercept don’t think there’s any reason to assume the worst.

“As long as you use long keys and recent-generation hashes, you should be OK,” said Huang. “Even if [WindsorGreen] gave a 100x advantage in cracking strength, it’s a pittance compared to the additional strength conferred by going from say, 1024-bit RSA to 4096-bit RSA or going from SHA-1 to SHA-256.”

Translation: Older encryption methods based on shorter strings of numbers, which are easier to factor, would be more vulnerable, but anyone using the strongest contemporary encryption software (which uses much longer numbers) should still be safe and confident in their privacy.
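The scale Huang describes can be sanity-checked with back-of-envelope arithmetic. The sketch below treats raw output-space size as a stand-in for brute-force difficulty (an assumption that ignores algorithm-specific attacks) and compares a hypothetical 100x hardware advantage with the jump from SHA-1 to SHA-256:

```python
# Back-of-envelope comparison: a 100x hardware speedup vs. a longer hash.
# Assumption: raw output-space size stands in for brute-force difficulty.
sha1_space = 2 ** 160        # SHA-1 output space
sha256_space = 2 ** 256      # SHA-256 output space

hardware_advantage = 100                        # hypothetical cracking speedup
hash_advantage = sha256_space // sha1_space     # 2 ** 96

# Even after the speedup, the longer hash leaves an astronomical margin.
print(hash_advantage // hardware_advantage)     # ~7.9e26
```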

Still, “there are certainly classes of algorithms that got, wildly guessing, about 100x weaker from a brute force standpoint,” according to Huang, so “this computer’s greatest operational benefit would have come from a combination of algorithmic weakness and brute force. For example, SHA-1, which today is well-known to be too weak, but around the time of 2013 when this computer might have come online, it would have been pretty valuable to be able to ‘routinely’ collide SHA-1 as SHA-1 was still very popular and widely used.”

A third expert in computer architecture and security, who requested anonymity due to the sensitivity of the documents and a concern for their future livelihood, told The Intercept that “most likely, the system is intended for brute-forcing password-protected data,” and that it “might also have applications for things like … breaking older/weaker (1024 bit) RSA keys.” Although there’s no explicit reference to a particular agency in the documents, this expert added, “I’m assuming NSA judging by the obvious use of the system.”

Huang and Koç both speculated that aside from breaking encryption, WindsorGreen could be used to fake the cryptographic signature used to mark software updates as authentic, so that a targeted computer could be tricked into believing a malicious software update was the real thing. For the NSA, getting a target to install software they shouldn’t be installing is about as great as intelligence-gathering gifts come.
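The signature-forgery scenario Huang and Koç describe can also be illustrated with a toy. The sketch below reuses deliberately tiny, insecure RSA numbers and a simplified signature scheme (real update signing uses padded signatures over full-length keys); the point is only that whoever holds the private exponent can sign anything:

```python
# Toy signature forgery with deliberately tiny, insecure RSA numbers.
# Real update signing uses padded signatures over full-length keys; the
# point here is only that knowing d lets anyone sign anything.
import hashlib

p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))    # private signing exponent

def sign(payload: bytes, priv: int) -> int:
    digest = int.from_bytes(hashlib.sha256(payload).digest(), "big") % n
    return pow(digest, priv, n)

def verify(payload: bytes, signature: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(payload).digest(), "big") % n
    return pow(signature, e, n) == digest

# An attacker who has recovered d can sign a malicious update that the
# target's updater will accept as authentic.
malicious_update = b"evil firmware"
print(verify(malicious_update, sign(malicious_update, d)))   # True
```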

The true silver bullet against encryption, a technology that doesn’t just threaten weaker forms of data protection but all available forms, will not be a computer like WindsorGreen, but something that doesn’t exist yet: a quantum computer. In 2014, the Washington Post reported on a Snowden document that revealed the NSA’s ongoing efforts to build a “quantum” computer processor that’s not confined to just ones and zeroes but can exist in multiple states at once, allowing for computing power incomparable to anything that exists today. Luckily for the privacy-conscious, the world is still far from seeing a functional quantum computer. Luckily for the NSA and its partners, IBM is working hard on one right now.

Repeated requests for comment sent to over a dozen members of the IBM media relations team were not returned, nor was a request for comment sent to a Department of Defense spokesperson. The NSA declined to comment. GCHQ declined to comment beyond its standard response that all its work “is carried out in accordance with a strict legal and policy framework, which ensures that our activities are authorised, necessary and proportionate, and that there is rigorous oversight.”

https://theintercept.com/2017/05/11/nyu-accidentally-exposed-military-code-breaking-computer-project-to-entire-internet/

Stop Tesla Model 3 before we lose our riches!

Greetings, oil product consumer. The time has come to make a decisive choice. Listen to your pumping heart, and let not our riches of your labor be lost to this Model 3!

Farid, IOBA Chairman & StopTesla.com founder

We want to warn every single individual on this planet about the planned release of Tesla Model 3* and the dangers that it unleashes towards international fossil fuel corporations.

*) the infamous fast, long-range, environment-friendly and offensively affordable fully electric auto-piloted car from Musk

http://stoptesla.com/

Prognosis negative: How California is dealing with below-zero power market prices | Utility Dive

Market forces and contract obligations are regularly dipping power prices below zero in California.

The dynamic is not new — negative pricing has occurred sporadically across the country for decades. But now, expanded renewable energy production, especially in the West, is prompting a new round of more consistent negative pricing.

“Negative pricing is driven by a hard-to-fathom dynamic in any efficient market,” said Jeff Bladen, the Market Services Executive Director for the Midcontinent Independent System Operator (MISO). “At times, it is more efficient for energy producers to give energy away free or even pay consumers to take their power plants’ generation than to curtail production because the shutdown and startup of the plant may cost them more.”

To counteract overproduction and negative pricing, grid operators can order the curtailment of utility generation, thermal or renewable.

Until recently, the frequency of negative pricing events was declining around the nation as transmission was built out and grid operators learned better techniques to integrate variable renewable generation.

But this year, western power systems, particularly the California ISO, have seen a boom in negative pricing incidents as flush hydro reserves from a rainy winter come together with an ever-expanding base of intermittent solar generation. Even with persistent curtailment of renewable energy, the average CAISO real-time electricity price dipped below zero twice a day in March.

The negative pricing threatens market revenues for traditional generators, sparking concern from some that flexible gas plants needed to balance out wind and solar production may have to shut down, as the La Paloma plant did last year. Combined with the renewable energy output lost to curtailment, the concerns have policymakers discussing ambitious market fixes to keep power prices in the black.

Worst in the West

Curtailment of renewable energy by the California Independent System Operator (CAISO) rose steadily in the second half of 2016 as solar penetration reached new highs, according to the grid operator’s March market report.

Curtailment reached record levels during California’s rainy winter as its hydropower supply rose 180% over 2016, said Guillermo Bautista Alderete, CAISO’s Market Analysis Director.

“Of the 288 daily 5-minute intervals, an average of 31% were curtailed in the first three months of 2017,” Alderete said. In 2015, 15% were curtailed; in 2016, that rose to 21%.

This year’s average curtailment is likely to drop after a drier summer and fall, but remain above previous years, he added.

Explaining curtailment & negative prices

There are two ways to think about the demand-supply imbalance that causes curtailment and negative prices, according to Alderete. One is the operational challenge of matching supply and demand. The other is the market perspective.

Curtailment happens infrequently in day-ahead markets because supply and demand are balanced in advance. More often, it occurs in the real-time market when high production from subsidized wind and solar push down power prices, forcing traditional generators to choose between the costs of turning off or paying customers to take their power.

“Negative pricing signals there is too much generation,” Alderete said. “For some resources, it is too expensive to shut down so they continue generating, even when they have to pay to do so.”
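Alderete’s shutdown-cost logic can be sketched numerically. All figures in this toy calculation are hypothetical, not from the article:

```python
# Toy run-vs-shutdown decision for a thermal plant facing negative prices.
# All numbers are hypothetical, for illustration only.
shutdown_startup_cost = 40_000.0   # $ to cycle the plant off and back on (assumed)
output_mw = 500                    # plant output (assumed)
negative_price = -15.0             # $/MWh market price (assumed)
hours_negative = 3                 # expected duration of negative prices (assumed)

# Running through the negative-price window means paying to generate...
cost_of_running = output_mw * hours_negative * abs(negative_price)   # $22,500

# ...but that can still beat the cost of cycling the plant.
keep_running = cost_of_running < shutdown_startup_cost
print(keep_running)                # True -> cheaper to pay than to shut down
```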

The Brattle Group’s Hannes Pfeifenberger sees negative market prices as a result of policy decisions to support renewables. Wind’s $23/MWh federal production tax credit (PTC) and solar’s 30% federal investment tax credit (ITC) give them an edge over traditional generators on price, and California’s renewable energy mandates allow those resources to be dispatched first in the generation stack, giving them greater influence over power prices.

Other generators weigh the costs of paying customers to take their production against the costs of ramping down. The ITC gives solar a capital expenditure advantage, and rooftop solar typically cannot be curtailed by the grid operator. For wind, the PTC’s after-tax value of up to $37/MWh means operators can afford to sell as low as negative $35/MWh and still potentially benefit, Pfeifenberger said.
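The PTC arithmetic in the paragraph above works out as follows, using the figures cited in the article (a $37/MWh after-tax credit and a negative $35/MWh price):

```python
# Wind operator's net revenue at a negative price, with the production tax
# credit (PTC) counted in. Figures are those cited in the article.
ptc_after_tax = 37.0     # $/MWh after-tax value of the PTC
market_price = -35.0     # $/MWh negative market price

net_revenue = market_price + ptc_after_tax
print(net_revenue)       # 2.0 -> still positive, so selling at -$35/MWh can pay
```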

Wind and solar generators may also face large penalties for not delivering contracted renewables that utilities need to meet state mandates, he said.

In addition to growing wind and solar, California’s curtailment and negative pricing are worse this year because its normal power trading with the Pacific Northwest has been disrupted by an abundance of hydropower from the wet winter.

Under a 2012 FERC ruling, the Bonneville Power Administration (BPA) is authorized to aggressively curtail wind energy to keep generation flowing from its eight-state hydropower system. The reason, spokesperson David B. Wilson said, is that significant unanticipated hydropower curtailment could greatly affect river flows, damaging habitat for salmon and other wildlife.

The FERC decision also validated BPA’s long-standing policy of not taking negative bids for hydropower.

“Because of the large amount of publicly available hydro data, paying negative prices would allow other marketers to take advantage of BPA’s need to generate,” Wilson said.

As a result, the Pacific Northwest is curtailing wind to allow hydro to generate, while California is curtailing wind and solar and still experiencing negative pricing. The dual phenomenon has policy watchers searching for ways to prevent that clean power from going to waste.

Cameron Yourkowski, senior policy manager for Renewable Northwest, said these circumstances are revealing significant “market inefficiencies” that impose costs on both systems. Despite open transmission interties, they are forced to keep an estimated 7,000 MW of expensive fossil generation spinning to meet demand peaks, he said.

“Optimizing these things could result in better outcomes and a system operator could do that,” Yourkowski said.

Emerging solutions for the West

One way to better optimize the western power system would be through expansion to a west-wide ISO, said Steven Greenlee, a CAISO spokesperson.

The current energy imbalance market (EIM) can provide some alternative demand, Greenlee said. “But if there was a Western region market, we could optimize all the participating systems instead of having to do so much of it in the real-time market.”

That CAISO expansion initiative, however, is currently stuck in political stasis after California and neighboring states reached an impasse over governance issues. Unless state leaders can move past concerns about ceding state sovereignty to a larger electricity market, it looks unlikely to proceed.

Renewable Northwest sees a shorter-term solution, Yourkowski said. At present, California’s rules on capacity payments to out-of-market generators keep natural gas plants idling in anticipation of peak demand.

“If those market rules were structured differently, wind and hydro in the northwest and California’s solar over-generation could replace the natural gas plants,” he said.

Under CPUC rules, capacity payments may go only to generators who bid into California’s real-time market, said Jim Caldwell, senior technical consultant for the Center for Energy Efficiency and Renewable Technologies (CEERT), who is working with Yourkowski.

Generators across the border in the Pacific Northwest do not have that capability.

If the CPUC were to devise a “work-around,” it could allow Northwest hydro to replace fossil fuels in California’s peak demand energy mix, he said. Instead of curtailing renewables at midday, California would export its solar over-generation to the Pacific Northwest, allowing BPA to hold back some of its hydro so that it would be available when California needs it. If BPA knows of the need to ramp down hydro in advance, it can plan releases so they do not harm wildlife habitat.

“California will use the Pacific Northwest like a giant battery,” Caldwell said. “It would not be frictionless, but it is manageable if they plan in advance.”

CAISO would have to make minor changes to its rules and practices, he added, “But we think we can get 80% or more of the benefit that we could from a regional market.”

Yourkowski agreed. The 400 MW of wind and hydro in the Northwest now delivered by the California-Northwest intertie for capacity adequacy could grow to 7,000 MW, he said.

“The new rules aren’t likely to be in place until next year but this year is revealing a lot about where work is needed to make the system more efficient,” Yourkowski said.

Another potential solution involves eliminating the stacking, or “pancaking,” of transmission tariffs as low-cost renewable generation is sent across isolated western balancing areas. That would also help reduce curtailment, Yourkowski said.

Nancy Kelly, a policy advisor at Western Resource Advocates (WRA), said that was the intent behind the formation of the Mountain West Transmission Group (MWTG).

The MWTG utilities include Basin Electric Power Cooperative, Black Hills Corp, Colorado Springs Utilities, Platte River Power Authority, Xcel Energy Colorado, Tri-State Generation and Transmission Cooperative, and Western Area Power Administration.

Transmission constraints that prevented Xcel from joining California’s EIM led to a plan from the MWTG to eliminate pancaking through a single tariff group, Kelly said. A Brattle study found the single tariff could save as much as $14 million a year — sizable, but not compared to the estimated benefits of between $53 million and $88 million per year from a regional market. MWTG first turned to California’s proposed regional market with its tariff proposal, Kelly said. When that effort was delayed, the utilities initiated ongoing talks with the Southwest Power Pool (SPP).

Curtailment for the rest

The curtailment and negative prices roiling California and Pacific Northwest markets are likely to resolve with warmer, drier weather, said Michael Goggin, research director for the American Wind Energy Association.

“The longer-term solution is expanding transmission capacity so high output of any type — wind, solar, or hydro — can be moved to where power is needed,” Goggin said.

Much Western transmission capacity goes unused because bilateral contracts between power producers and buyers, which are the bulk of western energy transactions, require that lines be kept open, Goggin said. Contracts also bypass the price signals that streamline markets’ competitive bidding.

There has been some curtailment and negative pricing in the western parts of the PJM, SPP, and Electric Reliability Council of Texas (ERCOT) markets, Goggin said. But, as detailed in the most recent wind market report from Lawrence Berkeley National Laboratory (LBNL), it has dropped significantly with the addition of transmission capacity, he added.

“Only 1.0% of potential wind energy generation within ERCOT was curtailed in 2015, down sharply from 17% in 2009,” LBNL reported. The main reasons, LBNL added, are transmission line capacity growth and more efficient wholesale markets.

Those market dynamics in ERCOT may be changing, however. As the penetration of subsidized wind and solar increases, the grid operator’s market is seeing very low and negative real-time pricing more frequently, stirring concerns among market observers that it may not provide sufficient incentives to site new generation in the future.

Just this week, generators NRG and Calpine filed a proposal with ERCOT to change pricing and settlement rules, saying that the wind PTC began having a significant impact on prices in 2016. The generators argued the grid operator should consider alternatives to its current socialized transmission planning process to avoid “subversion” of the market model.

It’s not a mess; it’s a market

Most stakeholders reached by Utility Dive agreed market fixes could go a long way toward correcting the negative pricing and curtailment in the West.

The MWTG set out to establish a single tariff, Kelly said, “but realized there are far greater benefits from joining a regional market.”

BPA’s Wilson sharply disagreed. If a market can minimize curtailments, “there may be a small benefit,” he said. But “it is unlikely that organized markets are going to be able to consistently uncover large amounts of generation or load flexibility that existing bilateral markets haven’t already found.”

Brattle’s Pfeifenberger said it is a matter of cost. “The more renewables you curtail, the more costly renewables become because a bilateral market is just not nimble enough.”

CAISO’s Alderete argued against characterizing negative pricing and curtailment as a failure of the market. Instead, the market is doing exactly what it is designed to do, he said. The problem is that the design no longer fits the grid’s needs.

“In the past, California’s main concern was having adequate capacity,” he said. “Today, the main concern is having adequate flexible capacity.”

http://www.utilitydive.com/news/prognosis-negative-how-california-is-dealing-with-below-zero-power-market/442130/