Thursday, 27 July 2017

Swedish Government On Verge Of Collapse

Swedish Government On Verge Of Collapse After Admitting 'Accidental Leak' Of Entire Nation's Info



26 July, 2017

Sweden’s government is being quietly rocked by a major scandal. So far the market has paid it little regard, but, as Citi warns, it may soon escalate...


Sweden appears to have accidentally leaked the details of almost all of its citizens. And it gets worse: the leak happened in 2015 but only emerged last week. As The Independent reports,

The leak allowed unvetted IT workers in other countries to see the details of people registered in Swedish government and police databases.
 
It happened after the government looked to outsource data held by the Transport Agency, but did so in a way that allowed that information to be available to almost anyone, critics have claimed.

The opposition is seeking to boot out the ministers of infrastructure, defence and the interior – Anna Johansson, Peter Hultqvist and Anders Ygeman, respectively – for their role in outsourcing IT-services for the Swedish Transport Agency in 2015.

Prime Minister Stefan Lofven admitted Monday his country and its citizens were exposed to risks by potential leaks as a result of the contract.


"This is a disaster," Swedish PM Lofven said. "This has exposed Sweden and Swedish citizens to risks."
 
The minority government has said the contract process – won by IBM Sweden – was sped up, bypassing some laws and internal procedures in a manner that may have allowed people abroad to handle servers with sensitive material.


As Reuters reports, the scandal has raised questions about the way it has been handled within the government. The security police informed the Justice Ministry in late 2015, but Lofven said he only found out about it early this year.


Lofven said Anna Johansson, minister of infrastructure and responsible for the Transport Agency, had not passed information on to him.
 
Johansson on Sunday in turn blamed one of her former state secretaries for not informing her about the scandal.
 
"I wish I had been informed earlier," Lofven said while adding he had no plans to fire any ministers. "I have full confidence in them (ministers) until I say otherwise."

However, as The Independent notes, the centre right opposition Alliance, comprising the Moderate, Centre, Liberal and Christian Democrat parties, has taken aim at Lofven's cabinet...


"It is obvious (they) have neglected their responsibility. They have not taken action to protect Sweden's safety", Centre party leader Annie Loof told a news conference.

Parliament is in recess but the opposition parties will submit a request to the speaker to summon legislators for a vote within 10 days.
If the opposition gets a majority the ministers will have to resign, a likely outcome as the nationalist Sweden Democrats have said they will support a vote of no-confidence.


"There are only two alternatives, either a new election or he himself (Lofven) resigns," Sweden Democrats leader Jimmie Akesson said.
 
"It feels like it's possible, yes," Anna Kinberg Batra, leader of the Moderates said.


As Citi notes, SEK has strengthened so far this week and no dramatic headlines seem to be impacting the currency yet, but some political noise may be on the way.

IBM Sweden says it never comments on relations with clients. The government said it had no comment pending a later statement; Prime Minister Lofven has since announced that he will hold a press conference on July 27 at 10am CET.

Finally, we wonder how long before Sweden blames Russia for leaking this 2-year-old secret.


Michael Mann issues a corrective about global temperature increases

This is preposterous, because Michael Mann offers up a corrective indicating that considerable warming had already occurred before measurements started.

The amount that I believe to be true is 1.6C since the onset of the industrial age.

Mann is still the man if you want reassuring lies. He is contradicting what he said in an interview on the Real News, to boot.

The idea of an available carbon budget is, of course, absolute nonsense. There is, according to David Wasdell, about 6C of warming implicit in the 400 ppm of CO2 currently in the atmosphere.

Chris Mooney is the journalist who dumped on Paul Beckwith over a mere suggestion (true) that the jet stream is crossing the equator.

We may have even less time to stop global warming than we thought
By Chris Mooney

24 July, 2017

At least since 2013, one of the biggest concerns in the climate change debate has been the so-called carbon budget — a fixed limit to the volume of carbon dioxide emissions that we can put into the atmosphere before irrevocably committing to a considerably hotter planet.

As of 2011, that budget was about 1,000 billion tons of carbon dioxide before the planet is likely to careen past a 2 degrees Celsius (3.6 degrees Fahrenheit) rise above what is believed to be the Earth’s temperature before industrialization. The budget shrinks by about 41 billion tons a year, and was more recently put at about 600 billion tons (or 15 years of emissions) by a group of scientists and climate policy wonks.
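The budget arithmetic in the paragraph above can be checked with a few lines of Python. All figures are the article's round numbers, not authoritative values:

```python
# Back-of-envelope check of the article's carbon-budget figures.
# All numbers are the article's round estimates, not authoritative values.
budget_2011 = 1000.0     # Gt CO2 remaining as of 2011 for a likely 2C limit
annual_emissions = 41.0  # Gt CO2 emitted per year, per the article

# Simple linear depletion from 2011 to 2017:
remaining_2017 = budget_2011 - annual_emissions * (2017 - 2011)
print(remaining_2017)  # 754.0 -- the newer ~600 Gt estimate is tighter still

# The article's more recent ~600 Gt figure, expressed in years of emissions:
years_left = 600.0 / annual_emissions
print(round(years_left, 1))  # 14.6, i.e. roughly "15 years of emissions"
```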

But now, a team of prominent climate scientists says the budget is probably even narrower. The problem is how you define “preindustrial,” or when you consider human-caused perturbations of the atmosphere to have begun.

Many analyses have taken the late 19th century as the starting point, but the new study in Nature Climate Change suggests significant human influence was afoot by at least 1750, and may have contributed as much as one-fifth of a degree Celsius of warming (0.36 degrees Fahrenheit) before the late 1800s.

“Frankly, this study does indicate that it may be more of an uphill battle than we previously thought in order to stabilize warming below the commonly defined dangerous limit of 2 degrees Celsius,” said Pennsylvania State University’s Michael Mann, one of the study’s authors. He completed the research with scientists from the universities of Edinburgh and Reading in the United Kingdom.

Defining what counts as “preindustrial” can be a bit of a moving target in climate research, but when the United Nations’ Intergovernmental Panel on Climate Change outlined the carbon budget in 2013, the group said that it was analyzing warming that had occurred “since the period 1861–1880.” But if the world had already warmed by a few slivers of a degree before then, that shrinks the carbon budget by “as much as 40% when earlier than nineteenth-century climates are considered as a baseline,” notes the new paper.
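The paper's "as much as 40%" figure translates directly into budget terms. As a rough illustration, starting from the article's ~600 Gt estimate (pairing the two numbers this way is my assumption, not the paper's calculation):

```python
# Illustrative arithmetic only: applying the paper's worst-case 40% reduction
# to the article's ~600 Gt remaining-budget estimate (an assumed pairing).
budget = 600.0                 # Gt CO2 remaining under the 1861-1880 baseline
reduced = budget * (1 - 0.40)  # earlier baseline, worst-case 40% cut
print(round(reduced, 1))       # 360.0 Gt

annual_emissions = 41.0        # Gt CO2 per year, from the article
print(round(reduced / annual_emissions, 1))  # 8.8 years at current rates
```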

To be sure, carbon budgets are only estimates — a way of trying to quantify the likelihood or risk of crossing 2 degrees Celsius for a given amount of emissions. The safer you want to be, the tighter the budget becomes. But for all carbon budgets, if you’re two-tenths of a degree closer to the threshold than you thought, the risk of tipping over is certainly higher.

Mann said that between the start of the industrial revolution in England in the 18th century and the late 19th century — when reliable thermometer records begin (by which time that revolution had spread to other countries) — humans may have added 30 or 40 parts per million of carbon dioxide to the atmosphere.

But there’s a lot of uncertainty here. The scientists don’t know precisely how much the planet warmed between the true start of industrialization and the late 19th century, when it was really starting to hum. Temperature records get spottier the farther you go back, which is one key reason that the late 19th century has generally been considered as the temperature baseline. Influential temperature data sets, like NASA’s, begin in this period (NASA’s starts with the year 1880).

The new research considers a variety of possibilities for how much temperatures rose in the early industrial revolution, generally in the range of just a few hundredths of a degree Celsius to about two-tenths of one. One-tenth of a degree would also raise the risk of breaching 2 degrees C, but not by as much.

Naturally, taking fuller account of this preindustrial warming also makes it much more likely that we’ll pass 1.5 degrees Celsius of warming. That’s an extremely challenging target that many observers and analysts have already written off, although it is cited as the more ambitious goal in the Paris climate agreement.

“It sort of takes 1.5 degrees Celsius off the table in the absence of active carbon removal,” said Mann, referring to possible technologies capable of actively withdrawing carbon dioxide from the atmosphere.

The analysis also greatly depends on how much humanity does or doesn’t change its behavior in coming years. If we keep emitting willy-nilly and follow what is often called a business-as-usual warming path, then a precise measurement of warming before the late 1800s won’t matter that much. We’ll blow past 1.5 and 2 degrees Celsius warming targets no matter what.

But if we’re actually beginning to curb emissions with the aim of hitting these goals — and there are some hints that we are — then a measurement of warming before the late 1800s really does matter. The risk of tipping over the line becomes greater — because you’re already closer to that line.

It is also important to bear in mind that the 2 degree Celsius target is a “normative goal, a value judgment,” said Reto Knutti, a climate expert with ETH Zurich who was familiar with the new study but did not contribute to it.


“There is no magic hard threshold that separates ‘safe’ from ‘dangerous.’ Not all impacts scale with temperatures, and what is dangerous to one person may seem okay to another,” Knutti said. “This is only partly the science issue of trying to quantify how warm the world was before humans started to substantially mess with the climate. It is just as much a political problem: If countries at some point are made responsible not just for their current but also for their past emissions (the polluter pays), then it matters when we start the historic blame game.”

Climate sensitivity larger "than previously thought" (by the computer modellers)

I can now understand why our retired professor from Victoria University did not like it at all when I mentioned Kevin Trenberth.

He is quoted in this article as criticizing the computer models for being too conservative.

The modellers don't like that. They hate Prof. Peter Wadhams.





Scientists just found a surprising possible consequence from a very small amount of global warming

By Chelsea Harvey




24 July, 2017

Even if we meet our most ambitious climate goal — keeping global temperatures within a strict 1.5 degrees Celsius (or 2.7 degrees Fahrenheit) of their preindustrial levels — there will still be consequences, scientists say. And they’ll last for years after we stop emitting carbon dioxide into the atmosphere.

New research suggests that extreme El Niño events — which can cause intense rainfall, flooding and other severe weather events in certain parts of the world — will occur more and more often as long as humans continue producing greenhouse gas emissions. And even if we’re able to stabilize the global climate at the 1.5-degree threshold, the study concludes, these events will continue to increase in frequency for up to another 100 years afterward. The findings were published Monday in the journal Nature Climate Change.

“It was really a surprise that what we find is after we reach 1.5 degrees Celsius and stabilize world temperatures, the frequency of extreme El Niño continued to increase for another century,” said Wenju Cai, a chief research scientist at Australia’s Commonwealth Scientific and Industrial Research Organization and one of the study’s lead authors. “We were expecting that the risk would stabilize.”

The study builds on a 2014 paper, also published in Nature Climate Change by Cai and a group of colleagues, which first suggested that extreme El Niño events will increase with global warming. That paper focused on a business-as-usual climate trajectory, in which greenhouse gas emissions remain at high levels into the future, Cai noted. It found that under this scenario, the frequency of extreme El Niño events would double from their preindustrial levels within this century.

The 2014 paper produced mixed responses among scientists at the time. Some experts, including Kevin Trenberth of the National Center for Atmospheric Research, suggested the models they used may not accurately simulate the behavior of El Niño.

Nevertheless, after the Paris climate agreement was finalized, and the 1.5-degree temperature goal was established, the researchers were interested in revisiting their previous work. This time, they specifically investigated the way El Niño would be affected if the world actually managed to stay within this climate threshold, a target that many scientists believe is already close to slipping through our fingers. Recent research has suggested that we’re on track to overshoot this climate goal within the next few decades.

During a typical El Niño event, Cai said, parts of the central and eastern tropical Pacific Ocean become warmer than usual, causing changes in wind patterns and rainfall in certain places around the world. Often, the consequences include warming over the western Americas and increased rainfall in the tropical Pacific. During an “extreme” El Niño event, these warming patterns tend to be shifted even further toward the east and the equator, forming a zone near the coast of Ecuador where intense amounts of heat transfer between the ocean and the atmosphere. The results tend to include even more intense rainfall in the region than usual, sometimes up to 10 times the typical amount, Cai said.

The researchers used a collection of 13 climate models to simulate a scenario in which global carbon dioxide emissions peak around the year 2040 and then decline, a trajectory that would keep the world within the 1.5-degree threshold. They then took note of how frequently these extreme events occurred in the simulations.

The models suggested that by the time we hit the 1.5-degree mark, the frequency of extreme El Niño will have doubled from its preindustrial level of about five events every 100 years to about 10. This increase will occur steadily over time, the researchers note, meaning that any additional increase in carbon dioxide in the future will lead to an increased risk of an extreme event.

This effect does increase slightly under stronger climate scenarios — the researchers report that under a 2-degree climate threshold, the increase in frequency is a bit stronger. But overall, each scenario produces approximately double the preindustrial frequency during this century, even if the effect is a bit larger under more severe trajectories. This is in keeping with the 2014 research, which suggests that under a business-as-usual climate scenario, the frequency of extreme El Niño events will also approximately double before the end of the century.

But the consequences won’t stop when we reach 1.5 degrees. The study suggests that the frequency of extreme El Niño events will continue to increase (although at a slower rate) even after global temperatures stabilize, potentially for up to another 100 years. These findings are less firm, since not all the models are capable of projecting beyond the end of the century. But several of them indicate that by the year 2150, the frequency will have grown to about 14 events per 100 years.

The researchers noted that the same results did not hold true for La Niña events, which often produce the opposite effects of El Niño. While previous research has suggested that more intense warming scenarios may lead to more frequent La Niña events as well, the milder climate trajectory in this study did not produce any significant changes.


Trenberth, who was not involved with the research, still has concerns about the models used in the research, which he says are “the same flawed models used before.” He argues that the models do a poor job of capturing some of the impacts of El Niño events — even “regular” ones — and the way they’re influenced by temperature and moisture in the atmosphere.

But Cai says he believes the study’s results are “believable” and that there are mechanisms to explain them. Because of the influence of climate change, the eastern equatorial Pacific is warming quickly, he said. As a result, it’s becoming easier for the critical centers of convection, or heat exchange, which affect global weather patterns, to move from west to east across the Pacific as they do during El Niño events.

The timing of El Niño events in the future will depend on factors including natural climate variations and weather patterns. Scientists are still working on figuring out better ways to predict El Niño before it hits, but for the time being, it’s often difficult to see it coming too far in advance. But over the course of a century, the study suggests we’ll see more of them as the climate continues to warm — and even after it stabilizes — even if we don’t know exactly when they’ll be coming.

And Cai noted that the findings also raise the question of what other types of climate effects might continue to evolve long after we stop emitting carbon dioxide into the atmosphere, whenever that may be. If El Niño is so severely affected, even at a 1.5-degree threshold, fluctuating temperature patterns in the Indian and Atlantic oceans may also be at risk of long-term changes under global warming, Cai suggested.


“Those are the questions scientists need to ask,” he said.

Record levels of atmospheric methane measured

Highest levels of methane recorded in history at Mauna Loa, Hawaii


Atmospheric methane update July 2017. NOAA data used.



Equatorial Levels at Mauna Loa, Hawaii represent the highest July readings in recorded history at 1850 ppb.

Arctic Levels at Barrow, Alaska are around 1920 ppb and are likely to equal last year's historic levels.




Note that methane levels always decline during the Northern Hemisphere summer, but the overall direction is up.
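The pattern described above, a summer dip superimposed on a rising trend, is the classic case for a 12-month running mean, which cancels the seasonal cycle and leaves the trend visible. A minimal sketch with synthetic numbers (the base level, trend, and amplitude below are illustrative, not NOAA values):

```python
import math

# Toy monthly CH4 series (ppb): a linear rise plus a seasonal cycle that
# dips mid-year. Base, trend, and amplitude are illustrative values only.
def monthly_ch4(month, base=1800.0, trend_per_year=8.0, amplitude=15.0):
    years = month / 12.0
    seasonal = amplitude * math.cos(2 * math.pi * years)  # low in NH summer
    return base + trend_per_year * years + seasonal

series = [monthly_ch4(m) for m in range(48)]  # four years of monthly values

# A 12-month running mean averages over exactly one seasonal cycle, so the
# seasonal term cancels and the underlying upward trend is exposed.
smoothed = [sum(series[i:i + 12]) / 12.0 for i in range(len(series) - 11)]

# July (m = 6) sits below the preceding January despite the rising trend...
assert series[6] < series[0]
# ...but every successive 12-month mean is higher: the trend survives.
assert all(b > a for a, b in zip(smoothed, smoothed[1:]))
```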


Data sources are:


Mauna Loa: 


Barrow:



(CH4, In situ data for both)

I have updated the Wikipedia pages on Atmospheric Methane and Arctic Methane Emissions.

 
and

https://en.wikipedia.org/wiki/Arctic_methane_emission

Melted fuel found at Fukushima - Corium up to 6 feet thick below reactor

Expert: Melted fuel found at Fukushima — Corium up to 6 feet thick below reactor — Nuclear waste “piling up at bottom” — Lava-like material has spread all over… “hanging like icicles” — Mystery orange substance seen (VIDEO)


ENENews,

24 July, 2017


Kyodo, Jul 22, 2017 (emphasis added): In big step forward, Tepco finds melted fuel at bottom of reactor 3 in Fukushima… The debris was clearly identifiable to at least one nuclear expert. “The images that appear to be melted fuel debris match those found in the (1986) Chernobyl crisis,” said Tadashi Narabayashi, a specially appointed professor of nuclear engineering working at Hokkaido University. “It’s definitely fuel debris… It’s an epoch-making event.”


New York Daily News, Jul 22, 2017: Underwater robot captures images of melted fuel at wrecked Fukushima nuclear plant — An underwater robot captured photos of 3-foot thick lumps of melted nuclear fuel covering the floor

Sky News, Jul 24, 2017: Melted nuclear fuel spotted in Fukushima reactor — The radioactive material has been spotted and pictured by a submersible robot…
CNN, Jul 24, 2017: What [the robot] has revealed appears to be stalactites of melted nuclear fuel, [Tepco] said… the robot sent back 16 hours’ worth of images of massive, lava-like fuel deposits

AP, Jul 23, 2017: [Images] showed massive deposits believed to be melted nuclear fuel covering the floor

Asahi Shimbun, Jul 23, 2017: Melted nuke fuel images show struggle facing Fukushima plant — Images captured on July 22 of solidified nuclear fuel debris at the bottom of a containment vessel of the crippled Fukushima No. 1 nuclear power plant show the enormity of decommissioning of the facility… [TEPCO] also discovered that the nuclear fuel debris has spread throughout the containment vessel.

AP, Jul 22, 2017: [TEPCO] said the robot found large amounts of lava-like debris apparently containing fuel that had flowed out of the core… TEPCO spokesman Takahiro Kimoto said it was the first time a robot camera has captured what is believed to be the melted fuel. “That debris has apparently fallen from somewhere higher above. We believe it is highly likely to be melted fuel or something mixed with it,” Kimoto said…

Kyodo, Jul 23, 2017: The robot was sent closer to the bottom of the reactor on Saturday and found possible fuel debris scattered in a wide area.
Japan Times, Jul 21, 2017: Fukushima robot finds potential fuel debris hanging like icicles in reactor 3… The objects spotted this time look like icicles… Tepco is pinning its efforts on technology not yet invented to get the melted fuel out of the reactors.

Reuters, Jul 21, 2017: Tepco detected black-colored material that dangled like icicles that could be nuclear debris near the bottom of the reactor’s pressure vessel that contained the fuel rods, the report said, citing unnamed sources.
Bloomberg, Jul 21, 2017: New images show what is likely to be melted nuclear fuel hanging from inside one of Japan’s wrecked Fukushima reactors… [Tepco] released images on Friday showing a hardened black, grey and orange substance

Financial Times, Jul 24, 2017: [Kimoto] was reluctant to speculate on the nature of seemingly corroded orange patches in the images.

NHK, Jul 23, 2017: [TEPCO] says Saturday’s probe found lumps that are highly likely to be fuel debris piling up at the bottom of the containment vessel… The deposits are estimated to be one to two meters thick. Images released on Saturday show black, rock-like lumps and what appear to be pebbles and sand accumulating at the bottom.


From 2014

Studies show multiple fuel cores ejected from Fukushima reactors – Hot particles of uranium and plutonium fuels detected nearly 300 miles away



27 August, 2014

Marco Kaltofen, Nuclear Science and Engineering, presented at Worcester Polytechnic Institute, March 19, 2014: High Radioactivity Particles in Japanese House Dusts… The Fukushima Dai‐ichi accident released very high activity inhalable dust particles that travelled long distances… Airborne dusts can transport radioactive materials as isolated individual particles containing high concentrations of radioisotopes. Alpha and beta emissions related to fission wastes and dispersed fuel particles are hazardous when inhaled or ingested. Radioactively‐contaminated environmental dusts can accumulate in indoor spaces, potentially causing significant radiation exposures to humans via inhalation, dermal contact, and ingestion… a micron‐scaled particle [had] activity greater than 1.0 PBq/kg [1 Quadrillion Bq/kg]. The particle was collected from a home in Nagoya, Japan. Nagoya is 460 km from the accident site… It contained both fission products and decay products of 238U… tellurium up to 48.0%, cesium up to 15.6%, rubidium up to 1.22%, polonium up to 1.19%, dysprosium up to 0.18%, as well as trace amounts of Sn, lead, nickel, iron, and chromium… 226Ra, 134Cs, and 137Cs, 241Am, and 230Th [were] the most commonly detected gamma photon-emitting isotopes… about 25% of dusts sampled [were] autoradiographically positive for hot particles… the majority of these hot particles were 10 um [micrometers] or less in size, meaning that they were potentially inhalable… Radioactively‐hot particles in the respirable size range were routinely detected, with one as far as 460 km [285 miles] from the release site.
Kaltofen: Radioisotopes in dusts released by Fukushima Daiichi units [include] Uranium and plutonium fuels and transuranics such as americium and neptunium… individual radioactive particles [in an] Ibaraki dust sample [include] Eu, Y, Zr, Th, Ce, Sr… in 1 to 15 um size range…
Kaltofen: The Japanese samples came from as far north as Sapporo in Hokkaido Prefecture and as far south as Tokyo, a range of 780 km. Fifty nine samples of dust from Japan were analyzed… Radioisotopes specific to the Fukushima Daiichi accidents, including Cs134, Cs137, and Co60 were detected in dust samples taken throughout Northern Japan, including areas more than 200 km outside of the accident exclusion zone. Cs134 was detected at all of the Japanese sites tested… Japanese samples… analyzed in the first month after the accident also contained I131 and Am241… Radioactive dust has become a ubiquitous part of life in northern Japan.
Chris Harris, former licensed Senior Reactor Operator & engineer, Aug 21, 2014 (at 24:00 in): NHK just [broadcast] that many studies are showing… that multiple cores — parts of it, or some, or even most of it — had been ejected. We thought that too. Once you breach containment, that was one of my big concerns — where did the core go after an explosion like that? Whether it be a steam or hydrogen explosion or a combination of both… it got ‘sneezed out’ all over the place. It’s totally – it’s a huge mess.
Source: Enenews