AXON has a 13-megapixel camera on the back and a secondary 2-megapixel camera that sits above it to help capture 3D depth for a refocusable photo. To get a better idea of the story behind the challenging work, Mobile World talked with two engineers from ZTE’s camera lab.
Interviewee: Zhu Yufei, responsible for the North American camera team and camera solution planning
Why did you decide to introduce a dual lens camera into AXON? How will the smartphone camera develop in the future?
Dual-lens camera technology is forward-looking. A photo from a single-lens camera is essentially a flat simulation of the scene, while a photo from a dual-lens camera is more vivid because dual-lens cameras capture more information. ...In terms of intelligence, we improved image stabilization and low-light photography on AXON through better algorithms and related technologies.
Are there any differences between ZTE’s dual-lens camera and other similar products?
The distance between the two lenses of AXON is greater. AXON outperforms other dual-lens smartphones in the large aperture effect, depth calculation accuracy, and the working distance of objects from the camera.
What are the trends for future smartphone cameras?
Both single-lens cameras and dual-lens cameras will coexist. Single-lens cameras will dominate in the near future, and dual lens cameras will continue to improve.
Interviewee: Xiao Longan, responsible for the camera software development
What are the tough nuts to crack in introducing the dual-lens camera to AXON? How did you overcome these difficulties?
...No smartphone project at ZTE has required so many human resources on the R&D of the camera app, and no camera app has received so much attention.
Which camera feature do you like most?
Background blur and refocus...
Can you explain more about how the refocus feature works?
The dual-lens camera captures depth data for each object in the image and adds it to the photo. This data allows you to refocus a picture and add a blur effect to selected parts of the image.
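As a rough sketch of how depth-based refocus can work in general (this is not ZTE's actual pipeline; the file names, depth tolerance, and blur kernel below are hypothetical), one can keep pixels near the tapped depth sharp and blend everything else with a blurred copy of the photo:

```python
# Hypothetical sketch of depth-based refocus (not ZTE's actual algorithm).
# Assumes an 8-bit depth map aligned with the photo, stored alongside the image.
import cv2
import numpy as np

def refocus(image, depth_map, tap_x, tap_y, depth_tolerance=12, blur_ksize=21):
    """Keep pixels near the tapped depth sharp and blur the rest."""
    target_depth = int(depth_map[tap_y, tap_x])
    # Mask of pixels considered "in focus" (within a depth tolerance band).
    in_focus = (np.abs(depth_map.astype(np.int16) - target_depth)
                <= depth_tolerance).astype(np.float32)
    # Feather the mask so the sharp/blurred transition looks natural.
    in_focus = cv2.GaussianBlur(in_focus, (15, 15), 0)[..., None]
    blurred = cv2.GaussianBlur(image, (blur_ksize, blur_ksize), 0)
    # Blend: sharp where the mask is 1, blurred where it is 0.
    out = in_focus * image.astype(np.float32) + (1 - in_focus) * blurred.astype(np.float32)
    return out.astype(np.uint8)

photo = cv2.imread("axon_photo.jpg")                       # hypothetical file names
depth = cv2.imread("axon_depth.png", cv2.IMREAD_GRAYSCALE)
result = refocus(photo, depth, tap_x=640, tap_y=360)
cv2.imwrite("refocused.jpg", result)
```

Feathering the mask avoids a hard edge between the sharp subject and the blurred background, which is roughly what the "background blur" effect described above looks like in practice.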
The year 2015 is shaping up to be the warmest year on record. In the media, a lot of attention has been given to the many floods, droughts, wildfires and heatwaves that have battered the world this year.
Sadly, though, little attention is given to the situation in the Arctic. The image on the right shows a forecast for December 30, 2015, with temperatures at the North Pole above freezing point. This is further illustrated by the nullschool.net image below, which shows a temperature forecast of 1.1°C or 34.1°F for the North Pole. Wind speed at the North Pole is forecast to be 105 mph or 168 km/h on December 30, 2015, and 133 mph or 215 km/h closer to Svalbard.
As the image below illustrates, very high temperatures are forecast to hit the Arctic Ocean on December 30, 2015.
Above image shows temperature anomalies at the highest end of the scale for most of the Arctic Ocean, with a temperature anomaly for the Arctic as a whole of 2.4°C or 4.32°F above what was common in 1979-2000. The situation isn't likely to improve soon. For January 3, 2016, the temperature in the Arctic is forecast to be as much as 4.56°C or 8.21°F warmer.
How is it possible for such high temperatures to occur over the Arctic Ocean? The image below shows how the year 2015 is shaping up in terms of temperature anomalies.
Global warming is felt most strongly in the Arctic as warming continues, as illustrated by above image and by the image on the right.
Warming in the Arctic is accelerating due to feedbacks. One of these feedbacks is the way the jet streams are changing. Changes in the jet streams are becoming more prominent as the Arctic is warming up more rapidly than the rest of the world.
As the difference in temperature between the Arctic and the equator becomes smaller, the speed at which the jet stream circumnavigates the globe is decreasing and jet streams become more wavy.
Meanwhile, most of the extra heat caused by global warming goes into the oceans, and the Atlantic Ocean is warming up fast. At the same time, meltwater is accumulating at the surface of the North Atlantic, lowering sea surface temperatures there. With such large differences between high temperatures over North America and lower temperatures over the North Atlantic, the speed of the jet stream between those places can increase dramatically.
The result is that huge amounts of warm air are being pushed high into the Arctic. The image on the right shows the jet streams on December 27, 2015, when speeds as high as 263 mph or 424 km/h were reached at the location marked by the green circle. Also note the jet streams crossing the Arctic at the top of the image, while crossing the equator at the bottom of the image.
The image below shows sea surface temperature anomalies on the Northern Hemisphere in November.
For over a month now, storms over the North Atlantic have been pushing hot air high up into the Arctic. The video below uses surface wind imagery from Climate Reanalyzer (selected daily averages and sequences of forecasts) to cover the period from December 5, 2015, to January 8, 2016.
Best wishes for 2016
Above video stops at January 8, 2016, when two cyclones are visible, one in the North Atlantic and another one over the North Pacific, prompting me to create the image on the right.
What causes these storms to grow this strong? The waters off the east coast of North America keep warming up dramatically. Emissions from North America tend to extend over these waters, due to the Coriolis effect, and this contributes to their extreme warming.
The image below shows carbon dioxide levels as high as 511 ppm over New York on November 5, 2015, and as high as 500 ppm over the water off the coast of New Jersey on November 2, 2015.
Emissions contribute to warmer waters
The top panel of the image on the right shows that on December 11, 2015, carbon dioxide levels were as high as 474 ppm (parts per million, surface concentration) at the location marked by the green circle in New York.
The bottom panel of the image on the right shows that the water off the coast was warmer by as much as 10.3°C or 18.5°F at the location marked by the green circle on December 11, 2015.
The NASA video below shows carbon dioxide emissions over the year 2006.
It's not just CO2 off the North American coast that contributes to further warming of the Gulf Stream; many other emissions do so as well, including methane and carbon monoxide (CO). Carbon monoxide is not a greenhouse gas, but it depletes hydroxyl, thus preventing oxidation of methane, a very potent greenhouse gas. The animation below shows a carbon monoxide level of 528 ppb at the green circle on December 28, 2015, 0900z, while the sea surface temperature anomaly there was 15.8°F or 8.8°C on that day.
Carbon monoxide reached much higher levels recently over land, as illustrated by the image below that shows a CO level of 2077 ppb in New York on January 6, 2016.
These emissions heat up the Gulf Stream, causing ever warmer water to be carried underneath the sea surface all the way into the Arctic Ocean, while little heat transfer occurs from ocean to atmosphere, due to the cold freshwater lid on the North Atlantic.
The image on the right shows that it was warmer by as much as 9.6°C or 17.2°F near Svalbard on December 25, 2015, at the location marked by the green circle. The same anomalies were recorded on December 26, 2015, when the temperature of the water there was 11°C or 51.8°F.
This gives an indication of how warm the water is that is being pushed underneath the sea surface into the Arctic Ocean.
Strong winds and high waves can cause more sea ice to be pushed along the edges of Greenland out of the Arctic Ocean, into the Atlantic ocean, expanding the cold freshwater lid on the North Atlantic, in a self-reinforcing feedback loop.
The image below shows the impact of these storms on sea ice speed and drift on December 31, 2015 (left) and a forecast for January 8, 2016 (right).
The danger is that, as warmer water reaches the seafloor of the Arctic Ocean, it will increasingly destabilize sediments that can contain huge amounts of methane in the form of free gas and hydrates.
Methane levels over the Arctic Ocean are already very high. Above image shows methane levels as high as 2745 ppb over the Arctic Ocean on January 2, 2016. High releases from the Arctic Ocean seafloor are pushing methane levels higher up in the atmosphere, as discussed in earlier posts such as this one.
So, while the extreme weather events that have occurred in the year 2015 are frightening, even more terrifying is the way the water of the Arctic Ocean is warming up. Sadly, this is rarely even discussed in the media. So, let's once more add the image below that should have been given more media attention.
The situation is dire and calls for comprehensive and effective action as described at the Climate Plan.
In a recent article in Scientific American, reporter David Biello summarizes the current state of carbon-capture technology, and it's not good. If a negative view of carbon capture appeared in some obscure climate-change-denier publication, it could be dismissed as biased reporting. But the elite-establishment Scientific American has been in the forefront of the anti-climate-change parade, and so for such an organ to publish such bad news means that we would do well to take it seriously.
The basic problem is that capturing a gas like carbon dioxide, compressing it, and injecting it deep enough underground where it won't come out again for a few thousand years is not cheap. And the worst fossil-fuel offenders—coal-fired power plants—make literally tons of the stuff every second. It would be hard enough to transport and bury tons of solid material (and coal ash is a nasty enough waste product), but we're talking about tons of a gas, not a solid. Just the energy required to compress it is huge, and the auxiliary operations (cleaning the gas, drilling wells, finding suitable geologic structures to hold it underground) add millions to billions to the cost of an average-size coal-fired plant. Worst of all, the goal for which all this effort is expended—slowing carbon-dioxide emissions—is a politically-tinged goal whose merit is doubted by many, and which is being ignored wholesale by some of the world's worst offenders in this regard, namely China and India.
However, shrinking the U. S. carbon footprint is regarded by many as a noble cause, and a few years ago Mississippi Power got on the bandwagon by designing a new lignite-burning power plant to capture its own carbon-dioxide emissions and send them into a nearby oil field, whereupon they expel oil that is, uh, eventually burned to make more carbon dioxide. Here is the first irony. Evidently, among the few large-scale customers for large quantities of carbon dioxide are oil companies, which send it underground (good) to make more oil come to the surface (not so good).
The second irony is an economic one. It is the punishment meted out by economics to the few good corporate citizens in a situation where most citizens are not being so good.
Currently in the U. S., there is no uniform, rational, and legally enacted set of rules regarding carbon-capture requirements. So far, the citizenry as a whole has not risen up and said, "In our constitutional role as the supreme power in the U. S., we collectively decide that capturing carbon dioxide is worth X billion a year to us, and we want it done pronto." Instead, there is a patchwork of voluntary feel-good individual efforts, showcase projects here and there, and large-scale operations such as the one Mississippi Power got permission to do from the state's utility commission, as long as they didn't spend more than $2.88 billion on the whole thing.
So far, it's cost $6.3 billion, and it's still not finished. This means big problems for the utility and its customers, in the form of future rate hikes. Capturing carbon is not a profitable enterprise. The notion of carbon-trading laws would have made it that way, sort of, but for political reasons it never got off the ground in the U. S., and unless we get a world government with enforcement powers, such an idea will probably never succeed on an international level. So whatever carbon capturing is going to be done will be done not because it is profitable, but for some other reason.
The embarrassment of Mississippi Power's struggling carbon-capture plant is only one example of the larger irony, which is that we don't know what an appropriate amount is to spend on carbon capture, because we don't know exactly, or even approximately, what it will cost if we don't, and who will pay. Probably the poorer among us will pay the most, but nobody can be sure. (There's a lot of very expensive real estate on coasts around the world, and sometimes I wonder if that influences the wealthy class to support anti-global-warming efforts as much as they do.)
The time factor is a problem in all this as well. Nearly all forecasts of global-warming tragedies are long-term things with timelines measured in many decades. That is good in the sense that we have a while to figure out what to do. But in terms of making economic decisions that balance profit against loss—which is what all private firms have to do—such long-run and widely distributed problems are chimerical and can't be captured by any reasonable accounting system. Try to put depreciation on an asset you plan to own from 2050 to 2100 on your income-tax return, and see how far you get.
So the only alternative in many places for large-scale carbon capture to happen is by government fiat. A dictatorial government such as China's could do this tomorrow if it wanted to, but as the recent Paris climate-accord meeting showed, it doesn't want to—not for a long time yet, anyway. In a nominal democracy such as the United States, the political will is strong in some quarters, but the unilateral non-democratic way the present administration has been trying to implement carbon limits has run into difficulties, to say the least.
My sympathies to residents of Mississippi who face the prospect of higher electric bills when, and if, their carbon-capturing power plant goes online. Whatever else the project has done, it has revealed the problems involved in building a hugely expensive engineering project for a payoff that few of those living today may ever see.
Sources: The article "The Carbon Capture Fallacy" by David Biello appeared on pp. 58-65 of the January 2016 edition of Scientific American.
Changchun, China-based Gpixel has expanded its image sensor portfolio, now offering 9 standard products for machine vision and industrial applications:
Nikkei reports that Sony announced a structural reform and personnel changes that will take effect on Jan 1, 2015 (probably a typo, should be 2016). The Sony Device Solution Business Group will be split into three organizations: "Automotive Division," "Module Division" and "Product Development Division."
Nikkei sees several reasons for the re-org:
The reorg will strengthen Sony's image sensor business.
Sony can better address the fast-growing automotive image sensor market, where On Semi currently has the largest share.
By offering modules rather than bare image sensors, Sony can better support customers that do not have the capability to tune and adapt image quality.
The other reasons for shifting the focus from image sensors to camera modules might be an attempt to replicate Sharp's camera module success, and an intention to capture a larger chunk of the imaging food chain.
SystemPlus publishes reverse engineering of Fujitsu Iris Authentication Module IR Camera Module & IR LED. The module has been extracted from Fujitsu Arrows NX F-04G smartphone that uses iris scan as the next biometric login technology. Compared to fingerprint sensors, Fujitsu claims that the solution features a faster, safer and more secure authentication. It is also a cost effective solution due to the reuse of standard CIS and LED components. OSRAM is said to be the IR LED manufacturer and has designed this 810nm LED exclusively for this iris scan application (this sounds somewhat contradictory to the re-use statement).
GlobeNewsWire: Grand View Research estimates the global LiDAR market size at over $260M in 2014. The market is expected to grow to $944.3M by 2022 driven by improved automated processing ability in data processing and image resolution capabilities.
Nikkei publishes an article on Olympus' IEDM 2015 presentation on a stacked visible and IR sensor. While it mostly reiterates the previously published data, some info is new:
"The laminated image sensor is made by combining (1) an image sensor equipped with an RGB color filter for visible light (top layer) and (2) a near-infrared image sensor (bottom layer). Each layer functions as an independent sensor and independently outputs video signals.
The visible-light image sensor is a backside-illuminated type, and its light-receiving layer (made of semiconductor) is as thin as 3μm. Each pixel measures 3.8 x 3.8μm, and the number of pixels is 4,224 x 240."
The California Department of Motor Vehicles (CDMV) has issued proposed regulations for self-driving cars (also known as autonomous vehicles, or AVs), and what they are planning wasn't all good news, at least to hopeful AV developers such as Google. Last Wednesday, the CDMV released a draft version of rules that would apply to AVs used not for experiments and tests (these have been allowed for some time already), but by paying customers. They are pretty restrictive.
For one thing, the CDMV doesn't want anybody selling AVs yet—only leasing. For another thing, a specially licensed person has to be in the vehicle whenever it's operating, and able to take over manual control at any time. These restrictions rule out some of the most commercially promising potential applications for AVs, namely, driverless delivery vehicles. In its defense, the CDMV says that more testing is needed before such vehicles can be let loose on California freeways. And having driven on California freeways myself, I have to say they may have a point.
You can't blame the CDMV for being cautious. So far, the testing Google and automakers such as Mercedes and Tesla Motors have done has not turned up any show-stopper problems with autonomous vehicle systems. But effects that don't show up in small-scale tests can raise their ugly heads later. I'm not a traffic engineer, but there may be new types of problems that don't arise until the percentage of AVs on the road rises above a certain threshold. Despite all the manufacturers' efforts, AVs will act differently than human-driven cars, and depending on the programming, sensor layout, and other factors, there may be some unknown interactions, perhaps between cars of different makes, that will lead to weird and possibly hazardous problems that nobody could have suspected in advance. We simply don't know. So going slow in the largest automotive market of any state is perhaps a good thing.
On the other hand, history shows that government restrictions on new technology can quickly become absurd and even obstruct progress. Historians of the automobile are familiar with the "red flag laws" that the English Parliament enacted in the latter part of the 1800s. A typical law of this type required any "powered locomotive" on a public road to be accompanied by a person walking at least sixty yards (55 m) ahead of the vehicle, holding a red flag to be used as a signal to the operator to halt, and also to warn passersby of the machine's approach. Despite rumors that these laws were passed specifically to slow down the spread of self-powered passenger vehicles, they were actually aimed at steam tractors, which were mobile steam engines used to operate agricultural machinery. Steam tractors were developed as early as the 1860s, and the larger ones could do considerable damage to the roads of the day and frighten horses, so the regulations were appropriate at the time they were first passed.
However, when the newer, smaller passenger automobiles of the 1890s came along, the 4-miles-per-hour speed limits and other restrictions that were appropriate for steam tractors made little sense for autos, and it took some time for popular demand and pressure from automakers to change the red-flag laws. Something similar happened in a few U. S. states, but by 1900 most red-flag laws had been repealed or transformed into regulations more suitable for internal-combustion cars.
There are a couple of lessons here for what could happen next with regard to AV regulations.
First, we should expect some overreacting on the part of government regulators. No regulator I know of ever got fired for being too vigilant. Unfortunately, very few regulators get fired for not being vigilant enough, either, but the tendency of a bureaucracy whose mission is to regulate an industry is to do more than necessary rather than less, up to the limit of the resources the regulator has at hand. Some commentators have said that what's bad for California is going to be good for Texas, which has taken a much more laissez-faire attitude toward AV experiments by Google and others. So we can thank what remnants of federalism remain in the U. S. for the fact that if one state passes excessively restrictive laws on an activity, companies can simply pull up stakes and go to a more friendly state.
The second lesson is more subtle, but has deeper and broader implications. It has to do with the gradual but pervasive spread of what is called "administrative law." To explain this problem, we need another historical detour.
Those familiar with the U. S. Constitution know that the powers of the federal government were purposely divided into three parts: the legislative branch for making the laws on behalf of the people it represents, the executive branch for enforcing the laws, and the judicial branch for judging whether citizens have violated the laws. This was done in reaction to the so-called "prerogative" that the English kings of the 1600s and earlier liked to exercise. In those bad old days, a king could haul off and make a law (legislative power), have his royal officers drag a subject in off the street (executive power), and pass judgment on whether the guy had broken the King's law (judicial power). Combining these distinct powers in one person was a great way to encourage despotism and tyranny. The authors of the U. S. Constitution had had enough of that, thank you, so they strictly divided the operations of government into three distinct branches corresponding to the three basic functions of government, and made sure that new laws could be originated only by representatives elected by the people.
But over the last century or so, the dam holding back government by prerogative has sprung lots of leaks in the form of administrative laws. Nobody elects anyone who serves in the California Department of Motor Vehicles. It's just a bunch of bureaucrats who can make up regulations (legislate), pronounce penalties for violation of those regulations (execute), and in some cases even decide on whether a party is guilty or innocent of violating the regulations (judge). Yes, the California Senate, a representative body, asked the CDMV to do this, but in turning over the power to make laws to the CDMV, the Senate abdicated its legislative function and handed it over to a non-representative body.
This is an oversimplified version of a huge and pervasive issue, but once you understand the nature of the problem, you can see versions of it everywhere, especially in the alphabet soup of federal agencies: OSHA, FDA, FCC, etc. At least in the case of the red-flag laws, it was Parliament itself which passed the laws, and which modified them in response to public demand when the time came. But if the voters of California don't like what the CDMV does, they don't have a lot of options.
Perhaps the streets of Austin will see lots of consumer-owned AVs before you can find any in Los Angeles. That's fine with me, as long as they drive at least as well as the average Texas driver. And that shouldn't be too hard.
CBS publishes an interview with Apple management team, that includes a part about Apple camera team and its head Graham Townsend:
One of the most complex engineering challenges at Apple involves the iPhone camera, the most used feature of any Apple product. That's the entire camera you're looking at in my hand.
Graham Townsend: There's over 200 separate individual parts in this-- in that one module there.
Graham Townsend is in charge of a team of 800 engineers and other specialists dedicated solely to the camera. He showed us a micro suspension system that steadies the camera when your hand shakes.
Graham Townsend: This whole sus-- autofocus motor here is suspended on four wires. And you'll see them coming in. And here we are. Four-- These are 40-micron wires, less than half a human hair's width. And that holds that whole suspension and moves it in X and Y. So that allows us to stabilize for the hand shake.
In the camera lab, engineers calibrate the camera to perform in any type of lighting.
Graham Townsend: Go to bright bright noon. And there you go. Sunset now. There you go. So, there's very different types of quality of lighting, from a morning, bright sunshine, for instance, the noonday light. And then finally maybe--
CBS: Sunset, dinner--
Graham Townsend: We can simulate all those here. Believe it or not, to capture one image, 24 billion operations go on.
Filmmaker IQ publishes a short video lecture on image sensors. While not perfectly accurate in some parts, it's amazing how much info one can squeeze into a 13-min video:
Yole publishes an interview with Heimann Sensor CEO Joerg Schieferdecker. Heimann Sensor is a 14-year-old German IR image sensor manufacturer. Schieferdecker talks about a number of emerging thermal IR imaging applications, such as a contactless thermometer embedded into a mobile phone:
"We are working to develop such a spot thermometer within Heimann Sensor. Many players in the mobile phone business are looking at these devices but we think that two more years of development are needed in order to get them working. First, single pixel sensors will be used and then arrays will be introduced into the market. The remaining issues are significant. For example, getting a compact device, within a thin ‘z budget’, is very challenging and needs further development at device and packaging level. In the same way, heat shock resistance is a challenge at the moment.
The applications of remote temperature sensing are just huge. Fever measurement, checking outdoor temperatures from your mobile phone, water temperature measurement for a baby’s bath or bottle - the list is endless. Most require 1 or 2°C accuracy, which is already available, while body temperature needs further development to reach 0.2°C accuracy.
We expect to see the first mobile phone with this feature around Christmas 2017."
BDTI publishes an article on the next generation Cadence Tensilica Vision P5 Processor IP featuring lower power and higher speed:
The base P5 core is said to be less than 2 mm2 in size in a 16nm process, with minor additional area increments for the optional FPU and cache and instruction memories. Lead customers have been evaluating and designing in Vision P5 for several months, and the core is now available for general licensing.
The Belgian DSP Valley newsletter publishes an article about Caeleste (see page 6). A few quotes:
"Caeleste was created in December 2006, by a few people around the former CTO of Fillfactory/Cypress. After an organic growth start, Caeleste is now 22 persons large and fast growing with a CAGR of 30-50%. It is specialized in the design and supply of custom specific CMOS Image Sensors (CIS). Caeleste is still 100% owned by its founders.
...In fact, Caeleste has almost the largest CIS design team in Europe, exclusively devoted to customer specific CMOS image sensors."
Some of Caeleste R&D projects:
Caeleste has the world record in low noise imaging: We have proven a noise limit of 0.34 e-rms; this implies that under low flux conditions (e.g. in astronomy) each and every photon can be counted and that the background is completely black, provided that the dark current is low enough.
Caeleste has also designed a dual color X-ray imager, which allows a much better discrimination and diagnosis of cancerous tissue than the conventional grey scale images.
Caeleste has also its own patents for 3D imaging, based on Time of Flight (ToF) operation. This structure allows the almost noise free accumulation of multiple laser pulses to enable accurate distance measurements at long distance or with weak laser sources.
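As a generic illustration of why accumulating many weak laser pulses helps pulsed ToF ranging (this is a toy simulation, not Caeleste's pixel architecture; the pulse shape, noise level and bin width are assumptions), averaging the returns raises the signal-to-noise ratio enough to locate the echo and convert its delay into distance:

```python
# Toy pulsed time-of-flight simulation with multi-pulse accumulation.
import numpy as np

C = 3.0e8                     # speed of light, m/s
dt = 1e-9                     # 1 ns time bins
t = np.arange(0, 400e-9, dt)  # 400 ns observation window (~60 m range)

def one_return(distance_m, rng, amplitude=0.05, noise=0.2):
    delay = 2 * distance_m / C                              # round-trip time
    pulse = amplitude * np.exp(-((t - delay) / 4e-9) ** 2)  # weak echo
    return pulse + rng.normal(0, noise, t.size)             # dominated by noise

rng = np.random.default_rng(0)
true_distance = 30.0          # metres (hypothetical target)
# Accumulate 1000 noisy returns; the echo emerges from the noise floor.
accumulated = np.mean([one_return(true_distance, rng) for _ in range(1000)], axis=0)

estimated_delay = t[np.argmax(accumulated)]
print("estimated distance:", estimated_delay * C / 2, "m")
```

Averaging N returns suppresses the noise roughly by a factor of the square root of N, which is why weak laser sources or distant targets can still be ranged accurately.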
Oppenheimer publishes December 14, 2015 report on TowerJazz business talking about the CIS part of it, among many other things:
"CMOS Image Sensors (CIS). Approximately 15% of sales, ex-Panasonic. Tower offers advanced CMOS image sensor technology for use in the automotive, industrial, medical, consumer, and high-end photography markets. Tower estimates the silicon portion of the CMOS image sensor market to be a ~$10B TAM, with ~$3B potentially being served via foundry offerings. Today, roughly two-thirds of this market serves the cellular/smartphone camera market; however, the migration to image-based communication across the automotive, industrial, security and IoT markets is expanding the applications for Tower's CIS offerings. Tower is tracking toward 35% Y/Y growth in this segment in 2015, outpacing the industry's 9% compounded growth rate, as the company has gained traction in areas previously not served by specialty foundries.
Tower has IP related to highly customized pixels, which lend the technology to a wide variety of applications. We see opportunities in automotive, security/surveillance, medical imaging, and 3D gesture control driving sustainable growth in this segment long term."
Oppenheimer: "We expect image sensor growth to outpace the overall optoelectronics segment, and forecast growth at a 6% CAGR from 2014 to 2019 to approximately $13.3B, from $10.4B in 2014"
In the Paris Agreement, nations committed to strengthen the global response to the threat of climate change by holding the increase in the global average temperature to well below 2°C above pre-industrial levels and by pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels.
How much have temperatures risen already? As illustrated by above image, NASA data show that during the three-month period from September through November 2015, it was ~1°C warmer than it was in 1951-1980 (i.e., the baseline).
A polynomial trend based on the data from 1880 to 2015 for these three months indicates that a temperature rise of 1.5°C above the year-1900 level will be reached in the year 2024.

Let's go over the calculation. The trendline shows it was ~0.3°C colder in 1900 than during the baseline period. Together with the current ~1°C rise above the baseline, that implies there has been a rise of ~1.3°C since 1900. A further rise of 0.2°C by 2024, as indicated by the trendline, would thus bring the total rise since 1900 to 1.5°C.
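For readers who want to reproduce this kind of estimate, the sketch below fits a polynomial trend to September-November anomalies and looks for the year in which the trend reaches 1.5°C above its 1900 value (the polynomial degree and the placeholder data are assumptions, not the exact method used for the graph):

```python
# Sketch: fit a polynomial trend to Sep-Nov anomalies (vs. the 1951-1980
# baseline) and find the year the trend reaches 1.5 C above its 1900 value.
import numpy as np

def year_of_rise(years, anomalies, rise_since_1900=1.5, degree=3):
    # Fit on years since 1900 to keep the polynomial well conditioned.
    x = np.asarray(years, dtype=float) - 1900.0
    trend = np.poly1d(np.polyfit(x, anomalies, degree))
    level_1900 = trend(0.0)                      # ~ -0.3 C in the post
    future = np.arange(years[-1], 2101)
    crossing = future[trend(future - 1900.0) - level_1900 >= rise_since_1900]
    return int(crossing[0]) if crossing.size else None

# Placeholder data shaped roughly like the real Sep-Nov series (NOT real data):
years = np.arange(1880, 2016)
anomalies = -0.3 + 1.3 * ((years - 1900) / 115.0) ** 2
print(year_of_rise(years, anomalies))            # prints 2024 for this placeholder
```

With the actual NASA anomalies in place of the placeholder series, this is essentially the calculation behind the 2024 figure.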
The situation is even worse than this. The Paris Agreement seeks to avoid a temperature increase of more than 1.5°C above pre-industrial levels. When we include the temperature rise from pre-industrial levels to the year 1900, it becomes evident that we have already surpassed a rise of 1.5°C since pre-industrial levels. This is illustrated by above image, earlier added at How much time is there left to act? (see the notes there), and by the graph below, from a recent post by Michael Mann, who adds that ~0.3°C of greenhouse warming had already taken place by the year 1900.
~0.3°C greenhouse warming had already taken place by 1900, and ~0.2°C warming by 1870
Let's add things up again. A rise of ~0.3°C before 1900, a further rise of 0.3°C from 1900 to the baseline (1951-1980), and a further rise of ~1°C from the baseline to date together add up to a rise of ~1.6°C from pre-industrial levels.
In other words, we have already surpassed a rise of 1.5°C from pre-industrial levels by 0.1°C.
The trendline indicates that a further rise of 0.5°C will take place by the year 2030, i.e. that without comprehensive and effective action, it will be 2°C warmer than pre-industrial levels before the year 2030.

Full wrath of emissions yet to come
The full wrath of global warming is yet to come and the situation is even more threatening than pictured above, for the following reasons:
Half of global warming has until now been masked by aerosols, particularly sulfates that are emitted when some of the dirtiest fossil fuels are burnt, such as coal and bunker oil. As we make the necessary shift to clean energy, the masking effect that comes with those emissions will disappear.
As Ricke and Caldeira point out, the carbon dioxide that is released now will only reach its peak impact a decade from now. In other words, we are yet to experience the full wrath of the carbon dioxide emitted over the past decade.
The biggest threat comes from temperature peaks. People in some parts of the world will be hit harder, especially during summer peaks, as discussed in the next section of this post. As temperatures rise, the intensity of such peaks will increase. The image on the right illustrates this with a forecast for December 25, 2015, showing extreme weather for North America, with temperatures as low as 30.6°F or -0.8°C in California and as high as 71.5°F or 22°C in North Carolina.
Feedbacks such as rapid albedo changes in the Arctic and large amounts of methane abruptly released from the Arctic Ocean seafloor could dramatically accelerate the temperature rise. Furthermore, water vapor increases by about 7% for every 1°C of warming, and since water vapor is one of the strongest greenhouse gases, this further amplifies the warming. The resulting temperature rise threatens to be non-linear, as discussed in the final section of this post; the short calculation below illustrates how the water-vapor increase compounds with warming.
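A rough back-of-the-envelope sketch of that compounding (a simplification that ignores all other feedbacks and assumes the 7%-per-degree figure holds throughout):

```python
# Rough illustration of the water-vapor amplification mentioned above:
# a ~7% increase per 1 C of warming compounds rather than growing linearly.
for warming_c in (1, 2, 3, 4):
    vapor_increase = (1.07 ** warming_c - 1) * 100
    print(f"{warming_c} C warming -> ~{vapor_increase:.0f}% more water vapor")
# 1 C -> ~7%, 2 C -> ~14%, 3 C -> ~23%, 4 C -> ~31%
```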
Situation even worse for some
Such temperature rises will hit some people harder than others. For people living in the Northern Hemisphere, the outlook is worse than for people in the Southern Hemisphere.
Similarly, the outlook is worse for people living in regions that are already experiencing high temperatures during the summer peaks. As noted, the intensity of such peaks will increase as temperatures rise.
Feedbacks in the Arctic
The image below, from an earlier post, depicts the impact of feedbacks that are accelerating warming in the Arctic, based on NASA data up to November 2013, and their threat to cause runaway global warming. As the image shows, temperatures in the Arctic are rising faster than elsewhere in the world, but global warming threatens to catch up as feedbacks start to kick in more. The situation obviously has deteriorated further since this image was created in November 2013.
The image below shows sea surface temperature anomalies on the Northern Hemisphere in November.
The image below gives an indication of the high temperatures of the water beneath the sea surface. Anomalies as high as 10.3°C or 18.5°F were recorded off the east coast of North America (green circle on the right panel of the image below) on December 11, 2015, while on December 20, 2015, temperatures as high as 10.7°C or 51.3°F were recorded near Svalbard (green circle on the right panel of the image below), an anomaly of 9.3°C or 16.7°F.
This warm water is carried by the Gulf Stream into the Arctic Ocean, threatening to unleash huge amounts of methane from its seafloor. The image below illustrates the danger, showing huge amounts of methane over the Arctic Ocean on December 10, 2015.
Methane is released over the Arctic Ocean in large amounts, and this methane is moving toward the equator as it reaches high altitudes. The image below illustrates how methane is accumulating at higher altitudes.
Above image shows that methane has recently been especially prominent at higher altitudes, having pushed up methane levels there by an estimated average of 9 ppb, or some 0.5%. Annual emissions from hydrates were estimated at 99 Tg in a 2014 post (image below).
An additional 0.5% represents some 25 Tg of methane. This comes on top of the 99 Tg estimated in 2014 to be released from hydrates annually.
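The arithmetic behind that estimate is straightforward, assuming the commonly used conversion of roughly 2.78 Tg of methane per ppb of global mean mixing ratio and a global mean level of about 1800 ppb (both are approximations, not figures taken from the post):

```python
# Sketch of the arithmetic behind "0.5% ~ 25 Tg of methane".
TG_PER_PPB = 2.78          # approx. Tg of CH4 per ppb of global mean mixing ratio
mean_level_ppb = 1800.0    # approximate global mean methane level
rise_ppb = 9.0             # estimated average rise at higher altitudes (from the post)

rise_percent = 100.0 * rise_ppb / mean_level_ppb   # ~0.5%
rise_tg = rise_ppb * TG_PER_PPB                    # ~25 Tg of methane

print(f"{rise_percent:.2f}% of the atmospheric level, ~{rise_tg:.0f} Tg CH4")
# This sits on top of the ~99 Tg/year estimated in the 2014 post for hydrates.
```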
The situation is dire and calls for comprehensive and effective action as described at the Climate Plan.