When is risk the highest?
Hat Tip: Vetri Subramaniam
Awesome quote by Peter Bernstein
Also at Alpha Ideas:
Weekend Mega Linkfest: June 30, 2018
Here, crowds of camera phone-toting tourists cooed and clapped at Pepper, an adorable 4-foot-tall robot with a high-pitched voice and a tablet screen strapped to its chest, during a recent test run. HSBC executives are posting seven of the machines, created by SoftBank Robotics, in the lobby of their Fifth Avenue branch starting Tuesday to interact with customers and signal that the London-based bank is serious about technology.
“How may I help you?” Pepper asks in a cheerful voice, urging onlookers to select among a half-dozen options, including tutorials on self-service channels like mobile banking and ATMs.
In one demonstration, after pitching a credit card and summoning a human colleague to help close the deal, Pepper helped a user kill time by dancing hypnotically to a techno beat. The robot also poses for selfies and tells jokes....MORE

Pepper, what have they done to you?
Schrödinger’s kittens have never been very cute, and the latest litter is no exception. Images of nebulous clouds of ultracold atoms or microscopic strips of silicon are unlikely to go viral on the internet. All the same, these exotic objects are worth heeding, because they show with unprecedented clarity that quantum mechanics is not just the physics of the extremely small.
“Schrödinger’s kittens,” loosely speaking, are objects pitched midway in size between the atomic scale, which quantum mechanics was originally developed to describe, and the cat that Erwin Schrödinger famously invoked to highlight the apparent absurdity of what that theory appeared to imply. These systems are “mesoscopic” — perhaps around the size of viruses or bacteria, composed of many thousands or even billions of atoms, and thus much larger than the typical scales at which counterintuitive quantum-mechanical properties usually appear. They are designed to probe the question: How big can you get while still preserving those quantum properties?
To judge by the latest results, the answer is: pretty darn big. Two distinct types of experiments — both of them carried out by several groups independently — have shown that vast numbers of atoms can be placed in collective quantum states, where we can’t definitely say that the system has one set of properties or another. In one set of experiments, this meant “entangling” two regions of a cloud of cold atoms to make their properties interdependent and correlated in a way that seems heedless of their spatial separation. In the other, microscopic vibrating objects were maneuvered into so-called superpositions of vibrational states. Both results are loosely analogous to the way Schrödinger’s infamous cat, while hidden away in its box, was said to be in a superposition of live and dead states.
The question of how the rules of quantum mechanics turn into the apparently quite different rules of classical mechanics — where objects have well-defined properties, positions and paths — has puzzled scientists ever since quantum theory was first developed in the early 20th century. Is there some fundamental difference between large classical objects and small quantum ones? This conundrum of the so-called quantum-classical transition was highlighted in iconic fashion by Schrödinger’s thought experiment.
The poor cat is a much-misunderstood beast. Schrödinger’s point was not, as often implied, the apparent absurdity of quantum mechanics if extrapolated up to the everyday scale. The cat was the product of correspondence between Schrödinger and Albert Einstein, after Einstein had criticized the interpretation of quantum mechanics championed by the Danish physicist Niels Bohr and his colleagues.
Bohr argued that quantum mechanics seems to force us to conclude that the properties of quantum objects like electrons do not have well-defined values until we measure them. To Einstein, it seemed crazy that some element of reality depends on our conscious intervention to bring it into being. With two younger colleagues, Boris Podolsky and Nathan Rosen, he presented a thought experiment in 1935 that appeared to make that interpretation impossible. The three of them (whose work now goes by the collective label EPR) noted that particles can be created in states that must be correlated with each other, in the sense that if one of them has a particular value for some property, the other must have some other particular value. In the case of two electrons, which have a property called spin, one spin might point “up” while the other electron’s spin points “down.”
In that case, according to Einstein and his colleagues, if Bohr is right and the actual directions of the spins are undetermined until you measure them, then the correlation of the two spins means that measuring one of them instantly fixes the orientation of the other — no matter how far away the particle is. Einstein called this apparent connection “spooky action at a distance.” But such a phenomenon should be impossible, because Einstein’s theory of special relativity shows that no influence can propagate faster than light.
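For reference, the EPR spin correlation is conventionally written as a "singlet" state (standard textbook notation, not from the article):

```latex
\[
\lvert \psi \rangle \;=\; \frac{1}{\sqrt{2}}
\left( \lvert \uparrow \rangle_A \lvert \downarrow \rangle_B
     \;-\; \lvert \downarrow \rangle_A \lvert \uparrow \rangle_B \right)
\]
```

If a measurement finds particle A's spin "up," particle B's spin is "down" with certainty, however far apart the particles are; that is the correlation Einstein found spooky.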
Schrödinger called this correlation between the particles “entanglement.” Experiments since the 1970s have shown that it is a real quantum phenomenon. But this doesn’t mean that quantum particles can somehow influence one another instantly across space through Einstein’s spooky action. It’s better to say that a single particle’s quantum properties are not necessarily determinate at one fixed place in space, but may be “nonlocal”: fully specified only in relation to another particle elsewhere, in a manner that seems to undermine our intuitive notion of space and distance....MUCH MORE
On 27th June 1858, Jane Welsh Carlyle writes to Mary Russell:
“It is so long since I wrote, and I have been so bothered and bewildered in the interval, that I can’t recollect if it is your turn or my own to write… I can only say I have had plenty of excuse for all my sins of omission of late weeks. First, my dear, the heat has really been nearer killing me than the cold. London heat! Nobody knows what that is till having tried it; so breathless, and sickening, and oppressive, as no other heat I have ever experienced is!”
The continuing heat had a disastrous effect on the River Thames, then the only outlet for all of the capital’s untreated sewage. What became known as “The Great Stink” made business in parliament intolerable.
Three days later, the Times reported the abandonment of a committee:
“A sudden rush from the room took place, foremost among them being the Chancellor of the Exchequer [Benjamin Disraeli] who, with a mass of papers in one hand and with his pocket handkerchief clutched in the other and applied closely to his nose, with body half bent, hastened in dismay from the pestilential odour, followed closely by Sir James Graham, who seemed to be attacked by a sudden fit of expectoration; Mr Gladstone also paid particular attention to his nose… The other members of the committee also precipitately quitted the pestilential apartment, the disordered state of their papers, which they carried in their hands, showing how imperatively they had received notice to quit.”
The “Great Stink” concentrated the minds of the legislators. On 2nd August, parliament passed an act allowing the construction of Joseph Bazalgette’s ambitious scheme for enclosed sewers for the capital.
1976 had the hottest summer average temperature in the UK since records began. Few places registered more than half their average summer rainfall. Denis Howell, the Labour Minister for Sport, was appointed “Minister for Drought” in 1976 and recalls in his memoirs:
“It is widely believed that the rains came immediately upon my appointment as ‘Minister for Drought’—and that it has never stopped raining since! The fact is we went a further ten days without rain and in the West Country were down to 25 days’ supply. The situation was growing desperate in parts of Yorkshire and it was arranged that I would travel there for a ceremonial turning-on of a standpipe put up on one of the housing estates… As I turned on the stopcock the rains came, only a few spots at first, but soon we were deluged. The reservoirs were empty and the ground was dried up. It was going to take weeks for the water to soak through the earth and fill the rivers and reservoirs. My appeals for continued water-saving, delivered as I stood in the pouring rain, were amusing but necessary. We had got through a crisis far more serious than most people appreciated. However, the emergency measures we had embarked upon did not need to be brought into operation so we were never able to test the efficiency of a two-line six-inch pipe system laid down 70 miles of the fast lane of the M5 motorway between Bristol and Exeter.”
When the weather broke there was widespread flooding and Howell was appointed Minister for Floods. During the harsh winter of 1978-9, he was appointed Minister for Snow. The relevant chapter of his autobiography is entitled “Man for All Seasons.”
The political economy of digitization is a fraught topic. Scholars and policymakers have disputed the relative merits of centralization and decentralization. Do we want to encourage massive firms to become even bigger, so they can accelerate AI via increasingly comprehensive data collection, analysis, and use? Or do we want to trust-bust the digital economy, encouraging competitors to develop algorithms that can “learn” more from less data? I recently wrote on this tension, exploring the pros and cons of each approach.
However, there are some ways out of the dilemma. Imagine if we could require large firms to license data to potential competitors in both the public and private sectors. That may sound like a privacy nightmare. But anonymization could allay some of these concerns, as it has in the health care context. Moreover, the first areas opened up to such mandated sharing may not even be personal data. Sharing the world’s best mapping data beyond the Googleplex could unleash innovation in logistics, real estate, and transport. Some activists have pushed to characterize Google’s trove of digitized books as an essential facility, which it would be required to license at fair, reasonable, and non-discriminatory (FRAND) rates to other firms aspiring to categorize, sell, and learn from books. Fair use doctrine could provide another approach here, as Amanda Levendowski argues.
In a recent issue of Logic, Ben Tarnoff has gone beyond the essential facilities argument to make a case for nationalization. Tarnoff believes that nationalized data banks would allow companies (and nonprofits) to “continue to extract and refine data—under democratically determined rules—but with the crucial distinction that they are doing so on our behalf, and for our benefit.” He analogizes such data to natural resources, like minerals and oil. Just as the Norwegian sovereign wealth fund and Alaska Permanent Fund socialize the benefits of oil and gas, public ownership and provision of data could promote more equitable sharing of the plenitude that digitization ought to enable.
Many scholars have interrogated the data/oil comparison. They usually focus on the externalities of oil use, such as air and water pollution and climate change. There are also downsides to data’s concentration and subsequent dissemination. Democratic control will not guarantee privacy protections. Even when directly personally identifiable information is removed from databases, anonymization can sometimes be reversed. Both governments and corporations will be tempted to engage in “modulation”—what Cohen describes as a pervasive form of influence on the beliefs and behaviors of citizens. Such modulation is designed to “produce a particular kind of subject[:] tractable, predictable citizen-consumers whose preferred modes of self-determination play out along predictable and profit-generating trajectories.” Tarnoff acknowledges this dark possibility, and I’d like to dig a bit deeper to explore how it could be mitigated.
Reputational Economies of Social Credit and Debt
Modulation can play out in authoritarian, market, and paternalistic modes. In its mildest form, such modulation relies on nudges plausibly based on the nudged person’s own goals and aspirations—a “libertarian paternalism” aimed at making good choices easier. In market mode, the highest bidder for some set of persons’ attention enjoys the chance to influence them. Each of these is problematic, as I have noted in articles and a book. However, I think that authoritarian modulation is the biggest worry we face as we contemplate the centralization of data in repositories owned by (or accessible to) governments. China appears to be experimenting with such a system, and provides some excellent examples of what data centralizers should constitutionally prohibit as they develop the data gathering power of the state.
The Chinese social credit system (SCS) is one of the most ambitious systems of social control ever proposed. Jay Stanley, a senior policy analyst at the ACLU’s Speech, Privacy & Technology Project, has summarized a series of disturbing news stories on China’s “Planning Outline for the Construction of a Social Credit System.” As Stanley observes, “Among the things that will hurt a citizen’s score are posting political opinions without prior permission, or posting information that the regime does not like.” At least one potential version of the system would also be based on peer scoring. That is, if an activist criticized the government or otherwise deviated from prescribed behavior, not only would her score go down, but her family and friends’ scores would also decline. This algorithmic contagion bears an uncomfortable resemblance to theories of collective punishment.
Admittedly, at least one scholar has characterized the SCS as less fearsome: more “an ecosystem of initiatives broadly sharing a similar underlying logic, than a fully unified and integrated machine for social control.” However, the heavy-handed application of no-travel and no-hotel lists in China does not inspire much confidence. There is no appeal mechanism—a basic aspect of due process in any scored society.
The SCS’s stated aim is to enable the “trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step.” But the system is not even succeeding on its own terms in many contexts. Message boards indicate that some citizens are gaming the SCS’s data feeds. For example, a bank may send in false information to blackball its best customer, in order to keep that customer from seeking better terms at competing banks. To the extent the system is a black box, there is no way for the victim to find out about the defamation....MUCH MORE
Belgian tanker company Euronav has purchased the Ultra Large Crude Carrier Seaways Laura Lynn, making it the proud owner of the only two ULCCs in the world capable of holding up to 3 million barrels of oil.
Euronav Tankers acquired the Seaways Laura Lynn from Oceania Tanker Corporation, a subsidiary of International Seaways, for $32.5 million.
Euronav has renamed the ULCC ‘Oceania’ and re-registered it under the Belgian flag.
The Seaways Laura Lynn was built in 2003 and has a deadweight of a whopping 441,561 metric tons. The vessel measures 380 meters in length and 68 meters in beam, with a draft of 24 meters.
The only other ULCC currently in the world is Euronav’s 442,470 dwt TI Europe.

Big, yes, but still nothing like the ones the Japanese and French built back in the day.
The two remaining ULCCs were among a series of four TI-class supertankers built in 2002-2003 by DSME for the Hellespont Group of Greece. Euronav acquired the vessels in 2004, but later sold one (Overseas Laura Lynn) to Oceania Tanker Corporation in March 2015. The two other vessels in the series have since been converted to Floating Storage and Offloading (FSO) vessels and remain under the ownership of Euronav subsidiaries....MORE
Here's how it stacked up against some other big boats (and buildings):...

Seawise Giant, later Happy Giant, Jahre Viking, Knock Nevis, Oppama, and finally Mont, was a ULCC supertanker that was the longest ship ever built. It possessed the greatest deadweight tonnage ever recorded. Fully loaded, its displacement was 657,019 tonnes (646,642 long tons; 724,239 short tons), the heaviest ship of any kind, and with a laden draft of 24.6 m (81 ft), it was incapable of navigating the English Channel, the Suez Canal or the Panama Canal. Overall, it was generally considered the largest ship ever built. It was sunk during the Iran–Iraq War, but was later salvaged and restored to service. It was last used as a floating storage and offloading unit (FSO) moored off the coast of Qatar in the Persian Gulf at the Al Shaheen Oil Field....MORE
Overall world debt in the last year or two is at its all-time high as a share of world GDP. But there is a common pattern that as countries grow and their financial markets develop, their level of debt also tends to rise. Perhaps even more interesting is that the relative importance of the components of that debt has been shifting. During and after the Great Recession, government borrowing was the main driver of rising global debt. But corporate borrowing has become more important...MORE
Moreover, this corporate borrowing has two new traits. One is that as bank regulators all over the globe have tightened up, this rise in corporate borrowing tends to take the form of bonds rather than bank loans. The other interesting trait is that this rise in corporate borrowing around the world can be traced back to developing economies--and especially to China.
Susan Lund, Jonathan Woetzel, Eckart Windhagen, Richard Dobbs, and Diana Goldshtein of the McKinsey Global Institute provide an overview in their June 2018 discussion paper, Rising Corporate Debt: Peril or Promise? An overview of the report is here; the full report is here. They write:
"In a departure from the past, most of the growth in corporate debt has come from developing countries, in particular China. Companies in advanced economies accounted for just 34 percent or $9.9 trillion of the growth in global corporate debt since 2007, while developing countries accounted for 66 percent or $19.2 trillion. Since 2007, China’s corporate debt has increased by $15 trillion, or more than half of global corporate debt growth. As a share of GDP, China’s corporate debt rose from 97 percent of GDP in 2007 to 163 percent in 2017, one of the highest corporate debt ratios in the world apart from small financial centers that attract offshore companies. The growth in corporate debt in China is mainly associated with a construction sector that increased its leverage as the housing market boomed. Today, 30 to 35 percent of corporate debt in China is associated with construction and real estate. ..."

"A relatively new feature of the debt landscape in recent years has been a shift in corporate borrowing from loans to bonds. Given the growing pressure on banks to meet new capital and liquidity standards, global nonfinancial corporate loans outstanding have been growing by only 3 percent annually on average since 2007 to stand at around $55 trillion in 2017. However, the share of global corporate debt in the form of bonds has nearly doubled, and the value of corporate bonds outstanding has grown 2.7 times since 2007. This is a positive trend, leading to a diversification of corporate financing. However, we also find risks."

Here are a couple of summary figures for nonfinancial corporate debt by country. The countries are ranked by total corporate debt as a share of GDP...
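The McKinsey shares quoted above can be checked with back-of-envelope arithmetic. A minimal sketch, using only the figures from the excerpt (all in trillions of USD, growth since 2007):

```python
# Back-of-envelope check of the McKinsey corporate-debt figures quoted above.
advanced = 9.9      # growth in corporate debt, advanced economies
developing = 19.2   # growth in corporate debt, developing economies
china = 15.0        # of which China

total_growth = advanced + developing          # ~29.1
share_advanced = advanced / total_growth      # ~0.34
share_developing = developing / total_growth  # ~0.66
china_share = china / total_growth            # ~0.52, i.e. "more than half"

print(f"total growth: ${total_growth:.1f}T")
print(f"advanced share: {share_advanced:.0%}, developing: {share_developing:.0%}")
print(f"China share of global growth: {china_share:.0%}")
```

The 34/66 split and the "more than half" claim for China are internally consistent.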
Last week the Fed again raised its benchmark Federal Funds rate target, now at 2%, up from the 0.25% rate that had been maintained steadily from late 2008 until late 2015, when the Fed, after a few false starts, finally worked up the courage — or caved to the pressure of the banks and the financial community — to start raising rates. The Fed also signaled its intention last week to continue raising rates – presumably at 0.25% increments – at least twice more this calendar year.

Some commentators have worried that rising short-term interest rates are outpacing increases at the longer end, so that the normally positively-sloped yield curve is flattening. They point out that historically flat or inverted yield curves have often presaged an economic downturn or recession within a year.

What accounts for the normally positive slope of the yield curve? It’s usually attributed to the increased risk associated with a lengthening of the duration of a financial instrument, even if default risk is zero. The longer the duration of a financial instrument, the more sensitive the (resale) value of the instrument to changes in the rate of interest. Because risk falls as the duration of the instrument is shortened, risk-averse asset-holders are willing to accept a lower return on short-dated claims than on riskier long-dated claims.

If the Fed continues on its current course, it’s likely that the yield curve will flatten or become inverted – sloping downward instead of upward – a phenomenon that has frequently presaged recessions within about a year. So the question I want to think through in this post is whether there is anything inherently recessionary about a flat or inverted yield curve, or is the correlation between recessions and inverted yield curves merely coincidental?

The beginning of wisdom in this discussion is the advice of Scott Sumner: never reason from a price change. A change in the slope of the yield curve reflects a change in price relationships.
Any given change in price relationships can reflect a variety of possible causes, and the ultimate effects, e.g., an inverted yield curve, of those various underlying causes, need not be the same. So, we can’t take it for granted that all yield-curve inversions are created equal; just because yield-curve inversions have sometimes, or usually, or always, preceded recessions doesn’t mean that recessions must necessarily follow once the yield curve becomes inverted.

Let’s try to sort out some of the possible causes of an inverted yield curve, and see whether those causes are likely to result in a recession if the yield curve remains flat or inverted for a substantial period of time. But it’s also important to realize that the shape of the yield curve reflects a myriad of possible causes in a complex economic system. The yield curve summarizes expectations about the future that are deeply intertwined in the intertemporal structure of an economic system. Interest rates aren’t simply prices determined in specific markets for debt instruments of various durations; interest rates reflect the opportunities to exchange current goods for future goods or to transform current output into future output. Interest rates are actually distillations of relationships between current prices and expected future prices that govern the prices and implied yields at which debt instruments are bought and sold. If the interest rates on debt instruments are out of line with the intricate web of intertemporal price relationships that exist in any complex economy, those discrepancies imply profitable opportunities for exchange and production that tend to eliminate those discrepancies. Interest rates are not set in a vacuum; they are a reflection of innumerable asset valuations and investment opportunities.
So there are potentially innumerable possible causes that could lead to the flattening or inversion of the yield curve. For purposes of this discussion, however, I will focus on just two factors that, in an ultra-simplified partial-equilibrium setting, seem most likely to cause a normally upward-sloping yield curve to become relatively flat or even inverted. These two factors affecting the slope of the yield curve are the demand for liquidity and the supply of liquidity.

An increase in the demand for liquidity manifests itself in reduced current spending to conserve liquidity and by an increase in the demands of the public on the banking system for credit. But even as reduced spending improves the liquidity position of those trying to conserve liquidity, it correspondingly worsens the liquidity position of those whose revenues are reduced, the reduced spending of some necessarily reducing the revenues of others. So, ultimately, an increase in the demand for liquidity can be met only by (a) the banking system, which is uniquely positioned to create liquidity by accepting the illiquid IOUs of the private sector in exchange for the highly liquid IOUs (cash or deposits) that the banking system can create, or (b) by the discretionary action of a monetary authority that can issue additional units of fiat currency.

Let’s consider first what would happen in case of an increased demand for liquidity by the public. Such an increased demand could have two possible causes. (There might be others, of course, but these two seem fairly commonplace.)...
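The duration point made in the excerpt — that longer-dated claims lose more value when rates rise — can be illustrated with a toy calculation. A minimal sketch (my own illustration, not from the post), using hypothetical zero-coupon bonds and a hypothetical 1-point rate rise:

```python
# Price of a zero-coupon bond paying `face` at maturity T, discounted at rate r.
def zero_price(r: float, T: float, face: float = 100.0) -> float:
    return face / (1 + r) ** T

r0, r1 = 0.02, 0.03  # hypothetical rates: before and after a 1-point rise
for T in (1, 5, 10, 30):
    p0, p1 = zero_price(r0, T), zero_price(r1, T)
    drop = (p0 - p1) / p0  # proportional price decline
    print(f"{T:>2}-year zero: price falls {drop:.1%} on a 1-point rate rise")
```

The percentage loss grows with maturity (roughly 1% for the 1-year zero, far more for the 30-year), which is why risk-averse holders demand extra yield on long-dated claims, and why the curve normally slopes upward.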
It’s that time again. At the end of this month, the Federal Reserve has over $30 billion of notes maturing. I won’t rehash what this might mean for the market, rather for those not familiar, I ask you to go read Pink Tickets On QT Days.
Let’s do a quick recap of what happened at the previous big QT day - last month end - May 31st, 2018:
Another large QT maturity day, and another down day in spooz.
The streak is alive and well. QT days have been rather large down days for the S&P 500.

HT: ZH
And this month’s maturity is another big one - over $30 billion....MORE
Since our launch in 1815 our commitment has always been the same, to help people plan for their future. pic.twitter.com/W7yqAiV8Yz— Scottish Widows (@ScottishWidows) June 3, 2018
Today the Silver Screen Bottling Company announced a new line of officially licensed Star Trek-inspired spirits, kicking off with the launch of a James T. Kirk Straight Bourbon Whiskey. The new Kirk whiskey is described as “an artisan bourbon that celebrates Captain Kirk’s bold spirit of adventure.”
The new Kirk Bourbon is a small-batch release selected from choice barrels aged between 4 and 12 years. According to Silver Screen Bottling, the Star Trek whiskey “exhibits a depth and richness seen in only the finest examples of bourbon with notes of caramel, Asian 5 spice and pecan” which is “crafted with the highest respect for the whiskey and for the man that bears its name.”...
Digital currency sales jumped to $13.7 billion in the first five months of the year, nearly double the amount raised for the whole of 2017, according to a report released on Thursday by PwC’s strategy and consulting division Strategy& and Switzerland-based Crypto Valley Association.
Technology startups in the blockchain space around the world have raised funds by selling cryptocurrencies, or tokens directly to investors in initial coin offerings (ICOs), bypassing banks or venture capital firms as intermediaries.
Blockchain, the technology that underpins bitcoin and other digital currencies, is a digital ledger that provides a secure way of making and recording transactions.
This year’s virtual currency sales from 537 coin offerings topped last year’s total of $7.0 billion, and included mammoth offerings from Telegram, a messaging service founded by Russia-born entrepreneurs Pavel and Nikolai Durov in 2013, and Block.one’s EOS currency, the report said.
Telegram raised $1.7 billion without its tokens being sold to the public. EOS, an infrastructure project for decentralized applications, raised more than $4 billion in a year-long token offering that started in the middle of last year....MORE
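The scale claims in the excerpt hold up to quick arithmetic. A minimal sketch, using only the dollar figures quoted above (billions of USD):

```python
# Sanity check of the ICO figures quoted above.
total_2018_5mo = 13.7   # raised in the first five months of 2018
total_2017 = 7.0        # raised in all of 2017
telegram, eos = 1.7, 4.0

print(f"2018 so far vs all of 2017: {total_2018_5mo / total_2017:.2f}x")  # ~1.96x, "nearly double"
print(f"Telegram + EOS alone: ${telegram + eos:.1f}B "
      f"({(telegram + eos) / total_2018_5mo:.0%} of the 2018 total)")
```

Two mammoth offerings account for over 40% of the year's total, a reminder of how top-heavy the ICO market was.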
Global equities are posting impressive gains to close out the quarter. The Shanghai Composite rose nearly 2.2%, its strongest gain in a couple of years on signals from the PBOC that focus may shift to growth stabilization, which could facilitate easing of credit conditions. The onshore yuan snapped a six-day slide, while the offshore yuan rose for the first time in 12 sessions.
More broadly, the MSCI Asia Pacific Index ended a four-day slide to finish 0.9% higher, recouping a third of this week's losses. Nearly all markets in the region rose, except Australia (-0.3%) and Thailand (-0.5%). Indonesian stocks rallied 2.3% even after the central bank surprised investors with a 50 bp rate hike. Many had expected a quarter-point move. The MSCI Asia Pacific Index rose one week in May and one week in June. It fell 3.8% over Q2, after slipping 0.6% in Q1. The benchmark rose in all four quarters last year.
European stocks are moving higher as well. Coming into today, the Dow Jones Stoxx 600 was up about 1.6% for the quarter and is tacking on another 1.2% today. All the major industry groups are advancing, led by information technology and financials. It is still off 1% on the week, and it's the fifth week of the past six that the benchmark has moved lower.
There are two main developments in Europe today. First, a deal appears to have been struck on immigration. It is very much along the lines that Italy's new government advocated, and one that will likely be sufficient to ease the tension within Merkel's CDU/CSU alliance. Essentially, the agreement calls for increased border security, holding centers for asylum seekers, an expedited process to determine eligibility for asylum, and an overhaul of the distribution of migrants when gateway states, like Italy, Malta, and Greece, are overwhelmed. Illegal immigrants will be expelled.
The second development was the flash June CPI report. It was in line with expectations. The headline rate rose to 2.0% from 1.9%, while the core rate slipped to 1.0% from 1.1%. The rise in the headline rate likely stems from the increase in energy prices.
The euro, which had returned to its recent lows yesterday (near $1.1525), is moving higher today. It took out yesterday's high in Asia to reach $1.1665. Although the high was reapproached after the CPI report, it stalled. The midweek high was a little above $1.1670, and the high for the week was near $1.1720. There was a large option struck at $1.15 (3 bln euros) which seems irrelevant now, and another nearly 900 mln euros struck at $1.1550 is not in play. However, the 2 bln euro option at $1.1600 could still impact activity in the North American morning.
Sterling eased to a new 2018 low yesterday near $1.3050, effectively meeting the objective of the double top pattern from the spring. It bounced off smartly, alongside the euro, in Asia. It reached almost $1.3185 in early Europe but turned down after the batch of data, which included a tick up in the Q1 GDP estimate to 0.2% from 0.1%. Today's data for Q2 suggest the economy has done a bit better. The market appears to be pricing in about a 70% chance of a BOE rate hike in August. This is around twice the probability that was discounted at the end of May.
European bonds are mixed. Italy's 10-year yield is off eight basis points to bring this week's decline to 12 bp. Spain and Portugal's 10-year benchmark yields are three basis points lower, while the core yields are slightly firmer. The US and UK 10-year yields are 1-2 basis points higher but are still 2-3 lower on the week.
An unexpected decline in Japanese unemployment and a smaller than forecast decline in industrial output failed to stir investors. May's unemployment fell to 2.2%, but the mild wage pressure (~0.9% year-over-year) blunts the significance for many investors. Industrial production was expected to have fallen by at least 1% in May but instead slipped a mild 0.2%, which allowed the year-over-year pace to improve to 4.2% from 3.4% in April.
The dollar is consolidating this week's small gains. Impressively consistent, the greenback has risen every week here in Q2 except two, for an overall gain of near 4.1%. Today there is a $466 mln option struck at JPY110.50 expiring. Another $372 mln JPY110 option also will be cut....
If the U.S. were serious, the request-for-proposal would be for six ships and they would have been started five to ten years ago.

It appears the rumors were true.
China, a non-polar nation, already has a small fleet of light and medium icebreakers and is rumored to have plans for a new medium icebreaker with a 3-3.5 meter-thick-ice capability as a stepping-stone to a couple of heavy icebreakers by the mid-to-late 2020s. They are serious about their Polar Silk Road.*
China National Nuclear Corporation on June 21st said bids are welcome from domestic yards to build the country’s first nuclear-powered icebreaker, newspaper Global Times reports.
The ship is said to be an “icebreaker support ship,” indicating a multi-role purpose beyond simply breaking the ice for other vessels in convoy. China’s only current ocean-going icebreaker, the “Xue Long” (Snow Dragon), is an icebreaking research vessel. Last summer, the ship sailed the entire Arctic rim with several stops en route, where the scientists on board worked on different ice- and climate-related research projects.
Bidders are required to take part in research, appraisal, building and testing, as well as providing technology support to the operator.
Russia is today the only country in the world that operates a fleet of civilian nuclear-powered vessels: four icebreakers and one container ship, all with Murmansk as home port. Three new, even more powerful nuclear-powered icebreakers are under construction.
China has experience in naval nuclear propulsion from a fleet of currently six military submarines of three different classes.
Song Zhongping, a military expert, told the Global Times the new icebreaker’s reactor could be applied to a nuclear-powered aircraft carrier once updated....MORE

That makes a nice jumping-off point to the asterisked footnote from that March post:
States from outside the Arctic region do not have territorial sovereignty in the Arctic, but they do have rights in respect of scientific research, navigation, overflight, fishing, laying of submarine cables and pipelines in the high seas and other relevant sea areas in the Arctic Ocean, and rights to resource exploration and exploitation in the Area, pursuant to treaties such as UNCLOS and general international law. In addition, Contracting Parties to the Spitsbergen Treaty enjoy the liberty of access and entry to certain areas of the Arctic, the right under conditions of equality and, in accordance with law, to the exercise and practice of scientific research, production and commercial activities such as hunting, fishing, and mining in these areas....

...and is dedicating 1,100 square feet of space on the latest planned icebreaker to laboratories.
If California lawmakers fail to boost consumer privacy rights by Thursday evening, a measure on the November ballot will.

Earlier this morning The Intercept reported:
California lawmakers on Thursday are racing to pass a privacy bill — not simply because they can’t wait any longer to give consumers more rights, but so an impenetrable November ballot measure doesn’t beat them to it.
If Gov. Jerry Brown fails to sign the California Data Privacy Protection Act by 5 p.m. Thursday, a ballot measure with similar protections would remain on the November ballot, and any changes to the law would be tough for lawmakers to make on their terms. By noon, the bill had passed the Senate.
Alastair Mactaggart, a Bay Area developer who put over $1 million behind the initiative that he wrote, said he would pull it from the November ballot if the bill was signed into law on Thursday — the last day to do so. So the bill spent the day racing through Assembly committees and the Senate floor to meet the deadline.
In addition to boosting consumers’ legal right to know what happens to their personal information, the measure would require businesses to disclose whether they sell that information and to stop outright if a consumer tells them to. It also prevents companies from charging fees for such requests or from discriminating against consumers who make them.
While companies like Google, Facebook, Verizon, and Comcast contributed money to fight the ballot measure, they haven’t fought lawmakers on the proposed bill....MORE
The billionaire George Soros has found a new way to make money from personal-injury lawsuits.
Soros Fund Management is pushing into a branch of litigation finance that few hedge funds have entered. His family office is bankrolling a company that’s creating investment portfolios out of lawsuits, according to a May regulatory filing.
The development is the latest twist on the litigation funding market, which has drawn criticism for monetising and encouraging the lawsuit culture in the US. The firm Soros is backing, Mighty Group, bundles cash advances that small shops extend to plaintiffs in personal injury suits in return for a cut of future settlements. Mighty Group’s approach opens the door to another potential development: securitising individual lawsuit bets for sale to other investors.
“There are all the ingredients there to securitise these things,” said Adrian Chopin, a managing director at legal finance firm Bench Walk Advisors. “A diversified, granular pool with predictable outcomes. The problem is, you can’t yet get these things rated” by credit agencies.
Wall Street has been betting for a while on commercial litigation, which provides financing of big corporate suits with millions or even billions of dollars at stake. Soros is focused on the consumer side, where plaintiffs receive advances of $2,000 on average for legal claims typically tied to auto and construction accidents. The advances are used to cover personal expenses, such as medical bills and rent.
Soros, along with Apollo Capital Management, is among the first money managers to jump into this niche of the lawsuit-funding market. It offers steady and predictable returns, which historically have averaged about 20% a year at relatively low risk, said Chopin of Bench Walk.
“Everybody is looking for yield, and people are also looking for assets that are not correlated with the major equity and debt markets,” said Christopher Gillock, a managing director at Colonnade Advisors, an investment bank that specialises in financial services. “Litigation funding falls into that category.”

The ABA Journal adds:
Joshua Schwadron, a co-founder of Mighty, declined to comment on the firm’s investors. Michael Vachon, a spokesman for Soros Fund Management, the billionaire’s New York-based family office, declined to comment....MORE
Hedge funds and private equity firms are jumping into another aspect of litigation finance with loans that finance mass tort cases against drug companies and medical device manufacturers.
One hedge fund getting involved is EJF Capital, which hopes to raise an additional $300 million for an investment vehicle for mass tort cases, the New York Times reports. Other established hedge funds have lent money to mass tort law firms, while “a slew of newer firms” that specialize in mass-tort lending are emerging.
According to the New York Times, litigation finance has been around for a long time, but it has mostly focused on financing complex, long-term commercial litigation. Lending money for consumer suits could be even more profitable....MORE
Amazon is expanding further into package delivery and promising to support a new wave of small business owners with the launch of a program that helps entrepreneurs start and run their own companies — delivering items purchased on Amazon.com in distinctive blue Prime-branded shirts and vans.

*Livery—
It’s “the next big building block of our end-to-end supply chain,” said Dave Clark, the Amazon executive who oversees the worldwide delivery logistics infrastructure for the e-commerce giant, in an interview with GeekWire, after a preview of the service for reporters in Seattle on Wednesday.
Amazon is announcing the new “Delivery Service Partners” program tonight. It’s the company’s latest move to build its own alternative to (if not yet a replacement for) UPS, FedEx, the U.S. Postal Service and other traditional shipping companies and postal services.
From Prime-branded planes to smiling blue delivery trucks, the Seattle tech giant is increasingly handling the shipping and delivery of items purchased on its site.
The announcement comes a few months after Amazon’s economic impact on the USPS was publicly questioned by President Donald Trump in a series of widely discussed tweets.
The new program lets anyone run their own package delivery fleet of up to 40 vehicles with up to 100 employees. Amazon works with the entrepreneurs — referred to as “Delivery Service Partners” — and pays them to deliver packages while providing discounts on vehicles, uniforms, fuel, insurance, and more. They operate their own businesses and hire their own employees, though Amazon requires them to offer healthcare, paid time off, and competitive wages. Amazon said entrepreneurs can get started with as low as $10,000 and earn up to $300,000 annually in profit.
Amazon is increasingly relying on its own logistics infrastructure as shipping costs continue to spike with growing demand, particularly from Prime members who pay $119 per year for free 2-day shipping and access to Amazon’s 2-hour Prime Now delivery service. It now has 7,000 truck trailers and 40 jumbo jets that shuttle packages to and from 125 fulfillment centers across the world.
In the past, Amazon relied on third-party providers for costly “last mile” deliveries — those from the warehouse to the customer. But now it is experimenting with new delivery methods. In 2015, the company launched a program called Amazon Flex, the company’s Uber-like platform that lets everyday people deliver packages with their own cars. One year later it started experimenting with its own delivery operation. Amazon also last year reportedly tested a service called “Seller Flex” in the U.S. that consists of the company picking up packages sold on its site directly from third-party warehouses.
Clark, a 19-year Amazon veteran who oversees worldwide delivery logistics infrastructure, said the new “Delivery Service Partners” program is not so much an evolution of Amazon Flex — which will continue to exist — but more of an addition to the company’s overall delivery network....MUCH MORE
In the latest TOP500 rankings announced this week, 56 percent of the additional flops were a result of NVIDIA Tesla GPUs running in new supercomputers – that according to the Nvidians, who enjoy keeping track of such things. In this case, most of those additional flops came from three top systems new to the list: Summit, Sierra, and the AI Bridging Cloud Infrastructure (ABCI).
Summit, the new TOP500 champ, pushed the previous number one system, the 93-petaflop Sunway TaihuLight, into second place with a Linpack score of 122.3 petaflops. Summit is powered by IBM servers, each one equipped with two Power9 CPUs and six V100 GPUs. According to NVIDIA, 95 percent of the Summit’s peak performance (187.7 petaflops) is derived from the system’s 27,686 GPUs.
NVIDIA did a similar calculation for the less powerful, and somewhat less GPU-intense, Sierra, which now ranks as the third-fastest supercomputer in the world at 71.6 Linpack petaflops. Although very similar to Summit, it has four V100 GPUs in each dual-socket Power9 node rather than six. However, the 17,280 GPUs in Sierra still represent the lion's share of that system's flops.
Likewise for the new ABCI machine in Japan, which is now that country’s speediest supercomputer and is ranked fifth in the world. Each of its servers pairs two Intel Xeon Gold CPUs with four V100 GPUs. Its 4,352 V100s deliver the vast majority of the system’s 19.9 Linpack petaflops.

Working backward, here is the TOP500 press release announcing the new list, June 25:
As dramatic as that 56 percent number is for new TOP500 flops, the reality is probably even more impressive. According to Ian Buck, vice president of NVIDIA's Accelerated Computing business unit, more than half the Tesla GPUs they sell into the HPC/AI/data analytics space are bought by customers who never submit their systems for TOP500 consideration. Although many of these GPU-accelerated machines would qualify for a spot on the list, these particular customers either don’t care about all the TOP500 fanfare or would rather not advertise their hardware-buying habits to their competitors.
It’s also worth mentioning that the Tensor Cores in the V100 GPUs, with their specialized 16-bit matrix math capability, endow these three new systems with more deep learning potential than any previous supercomputer. Summit alone boasts over three peak exaflops of deep learning performance. Sierra’s performance in this regard is more in the neighborhood of two peak exaflops, while the ABCI number is around half an exaflop. Taken together, these three supercomputers represent more deep learning capability than the other 497 systems on the TOP500 list combined, at least from the perspective of theoretical performance.
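The "peak deep learning exaflops" figures above follow from simple multiplication: GPU count times the V100's 125-teraflop FP16 Tensor Core peak. A minimal sketch of that arithmetic (GPU counts are from the article; this is theoretical peak, not a measured benchmark):

```python
# Rough peak deep-learning throughput: GPU count x per-GPU Tensor Core
# rate. 125 TFLOPS is the published V100 FP16 Tensor Core peak; GPU
# counts come from the article.

V100_TENSOR_TFLOPS = 125

systems = {"Summit": 27_686, "Sierra": 17_280, "ABCI": 4_352}

for name, gpus in systems.items():
    exaflops = gpus * V100_TENSOR_TFLOPS / 1_000_000  # TFLOPS -> EFLOPS
    print(f"{name}: ~{exaflops:.2f} peak deep-learning exaflops")
# Summit: ~3.46, Sierra: ~2.16, ABCI: ~0.54
```

The totals line up with the article's "over three," "around two," and "around half an exaflop" characterizations.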
The addition of AI/machine learning/deep learning into the HPC application space is a relatively new phenomenon, but the V100 appears to be acting as a catalyst. “This year’s TOP500 list represents a clear shift towards systems that support both HPC and AI computing,” noted TOP500 author Jack Dongarra, Professor at University of Tennessee and Oak Ridge National Lab....MORE
FRANKFURT, Germany; BERKELEY, Calif.; and KNOXVILLE, Tenn.—The TOP500 celebrates its 25th anniversary with a major shakeup at the top of the list. For the first time since November 2012, the US claims the most powerful supercomputer in the world, leading a significant turnover in which four of the five top systems were either new or substantially upgraded.

And the list:
Summit, an IBM-built supercomputer now running at the Department of Energy’s (DOE) Oak Ridge National Laboratory (ORNL), captured the number one spot with a performance of 122.3 petaflops on High Performance Linpack (HPL), the benchmark used to rank the TOP500 list. Summit has 4,356 nodes, each one equipped with two 22-core Power9 CPUs, and six NVIDIA Tesla V100 GPUs. The nodes are linked together with a Mellanox dual-rail EDR InfiniBand network.
Sunway TaihuLight, a system developed by China’s National Research Center of Parallel Computer Engineering & Technology (NRCPC) and installed at the National Supercomputing Center in Wuxi, drops to number two after leading the list for the past two years. Its HPL mark of 93 petaflops has remained unchanged since it came online in June 2016.
Sierra, a new system at the DOE’s Lawrence Livermore National Laboratory took the number three spot, delivering 71.6 petaflops on HPL. Built by IBM, Sierra’s architecture is quite similar to that of Summit, with each of its 4,320 nodes powered by two Power9 CPUs plus four NVIDIA Tesla V100 GPUs and using the same Mellanox EDR InfiniBand as the system interconnect.
Tianhe-2A, also known as Milky Way-2A, moved down two notches into the number four spot, despite receiving a major upgrade that replaced its five-year-old Xeon Phi accelerators with custom-built Matrix-2000 coprocessors. The new hardware increased the system’s HPL performance from 33.9 petaflops to 61.4 petaflops, while bumping up its power consumption by less than four percent. Tianhe-2A was developed by China’s National University of Defense Technology (NUDT) and is installed at the National Supercomputer Center in Guangzhou, China.
The new AI Bridging Cloud Infrastructure (ABCI) is the fifth-ranked system on the list, with an HPL mark of 19.9 petaflops. The Fujitsu-built supercomputer is powered by 20-core Xeon Gold processors along with NVIDIA Tesla V100 GPUs. It’s installed in Japan at the National Institute of Advanced Industrial Science and Technology (AIST).
Piz Daint (19.6 petaflops), Titan (17.6 petaflops), Sequoia (17.2 petaflops), Trinity (14.1 petaflops), and Cori (14.0 petaflops) move down to the number six through 10 spots, respectively.
Despite the ascendance of the US at the top of the rankings, the country now claims only 124 systems on the list, a new low. Just six months ago, the US had 145 systems. Meanwhile, China improved its representation to 206 total systems, compared to 202 on the last list. However, thanks mainly to Summit and Sierra, the US did manage to take the lead back from China in the performance category. Systems installed in the US now contribute 38.2 percent of the aggregate installed performance, with China in second place with 29.1 percent. These numbers are a reversal compared to six months ago.
The next most prominent countries are Japan, with 36 systems, the United Kingdom, with 22 systems, Germany with 21 systems, and France, with 18 systems. These numbers are nearly the same as they were on the previous list....MORE
June 2018

Earlier today:
The TOP500 celebrates its 25th anniversary with a major shakeup at the top of the list. For the first time since November 2012, the US claims the most powerful supercomputer in the world, leading a significant turnover in which four of the five top systems were either new or substantially upgraded....MUCH MORE
Dally, chief scientist at NVIDIA, is an icon in the deep learning world. A prolific researcher with more than 150 patents, he previously chaired Stanford University’s computer science department.
Dally sat down with AI podcast host Noah Kravitz to share his reflections on artificial intelligence — a field he’s been working in for decades, which has had a renaissance thanks to GPU-driven deep learning. AI, he says, is “going to transform almost every aspect of human life.”

As an example of the convergence, also from the NVIDIA blog:
Roots of the Current AI Revolution
When Dally first started his neural networks research in the 1980s, “we had computers that were literally 100,000 times slower than what we have today,” he told Kravitz.
Today’s AI revolution is enabled by powerful GPUs. But it took a lot of work to get there, such as the 2006 launch of the CUDA programming language by NVIDIA’s Ian Buck.
“The GPUs had the computational resources, and CUDA unlocked it,” Dally said.
As GPU computing gained traction, Dally met with fellow deep learning luminary Andrew Ng for breakfast. Ng was working on a now well-known project that used unsupervised learning to detect images of cats from the web.
This work took 16,000 CPUs on Google Cloud. Dally suggested they collaborate to use GPUs for this work — and so began NVIDIA’s dive into deep learning.
Dally says there are two main focus areas for neural networks going forward: building more powerful algorithms that ramp up the efficiency of doing inference, and developing neural networks that train on much less data.
Technological advancements have an “evolutionary component and a revolutionary component,” he said. “In research, we try to focus on the revolutionary part.”
Strengthening Research Culture at NVIDIA
When Dally joined NVIDIA as chief scientist in 2009, the research team had fewer than a dozen scientists. Today, it’s 200 strong.
Dally’s goal is for NVIDIA researchers to do excellent work in areas that will have a major impact on the company in the future. He says publishing strong research in top-tier venues is essential because it provides peer review feedback that is key for quality control.
“It’s a humbling experience,” he said. “It makes you better.”
This week, NVIDIA researchers are presenting 14 accepted papers and posters, seven of them during oral sessions, at the annual Computer Vision and Pattern Recognition conference in Salt Lake City....MORE (the podcast)
Just weeks after its debut, Summit, the world’s fastest supercomputer, is already blasting through scientific applications crucial to breakthroughs in everything from superconductors to understanding addiction.
Summit, based at the Oak Ridge National Laboratory, in Tennessee, already runs CoMet — which helps identify genetic patterns linked to diseases — 150x faster than its predecessor, Titan. It’s running another application, QMCPACK — which handles quantum Monte Carlo simulations for discovering new materials such as next-generation superconductors — 50x faster than Titan.
The ability to quickly accelerate widely used scientific applications such as these comes thanks to more than a decade of our investment across what technologists call “the stack.” That is, everything from architecture improvements in our GPU parallel processors to system design, software, algorithms, and optimized applications. While innovating across the entire stack is hard, it’s also essential because, with the end of Moore’s law, there are no automatic performance gains.
Summit, powered by 27,648 NVIDIA GPUs, is the latest GPU-powered supercomputer built to accelerate scientific discovery of all kinds. Built for the U.S. Department of Energy, Summit is the world’s first supercomputer to achieve over 100 petaflops, accelerating the work of the world’s best scientists in high-energy physics, materials discovery, healthcare and more.
But Summit delivers more than just speed. Instead of one GPU per node with Titan, Summit has six Tensor Core GPUs per node. That gives Summit the flexibility to do traditional simulations along with the GPU-driven deep learning techniques that have upended the computing world since Titan was completed six years ago.
How Volta Stacks the Deck
With Volta, we reinvented the GPU. Its revolutionary Tensor Core architecture enables multi-precision computing. So it can crank through deep learning at 125 teraflops at FP16 precision. Or when greater range or precision is needed, such as for scientific simulations, it can compute at FP64 and FP32....MORE
Shares of drugstore companies are tumbling Thursday after Amazon announced it signed an agreement to acquire online pharmacy PillPack.
Shares of Walgreens Boots Alliance and CVS Health fell 9 percent and 8 percent in Thursday’s premarket session, respectively, as investors worried Amazon may disrupt the drugstore market. Rite Aid's stock also declined 8 percent.
PillPack is an online pharmacy that packages, organizes and delivers pre-sorted doses of medications.
“PillPack’s visionary team has a combination of deep pharmacy experience and a focus on technology,” Jeff Wilke, Amazon CEO Worldwide Consumer, said in a statement. “PillPack is meaningfully improving its customers’ lives, and we want to help them continue making it easy for people to save time, simplify their lives, and feel healthier. We’re excited to see what we can do together on behalf of customers over time.”...MORE
Jean Tirole is an intellectual giant in the economics world. The Frenchman is the foremost thinker on market power and regulation, and won the Nobel prize in 2014 for his work in this area.
His insights are particularly relevant today, as large tech firms grow ever larger and more powerful. Advances in technology have mostly made our lives better, but as privacy concerns rise and fake news spreads, we are starting to see the downside of giving tech companies mostly unchecked power. In the past, regulators could deal with this by breaking up firms or making them public utilities. That hasn’t happened with the tech giants, even though many people and policymakers feel like something should be done—but what?
Tirole’s recent book, Economics for the Common Good, offers some answers. The final third is a handbook on how to think about the ways technology is changing the economy, and what we can do about it. Quartz asked him some of the more pressing questions of the day.
Quartz: The early days of tech promised a ruthlessly competitive market place where even small players could reach billions at little cost. Instead, it seems we ended up with less competition. What happened?
Tirole: There is a sense in which tech has delivered. Small firms have been empowered in many ways. They can avail themselves of cheap back-office and cloud services; they can easily connect with consumers; they can fine-tune their advertising rather than engage in blind mass advertising; their access to borrowing is facilitated by AI-driven lenders, as is the case for the more than 7 million Chinese small and medium-size firms financed by Ant Financial. And, importantly, they can more easily build their own reputation. A taxi driver relied on the taxi company’s reputation; today, through ratings, the driver can have his or her own reputation on a ride-hailing platform.
But at the platform level, competition confronts the existence of large returns to scale and/or network externalities, leading to natural monopoly situations and a winner-take-all scenario. Network externalities can be direct: I am on Facebook or Twitter because you also are; I will use Uber or Lyft if many drivers do so. Network externalities can also be indirect: We may not care directly about the presence of other users on the platform, but that presence leads to improved services, as in the case of many apps or delivery services. For example, I want to use Google’s search engine or Waze if you also use them, as the quality of predictions improves with the number of users.
Natural monopoly situations lead to widespread market power, and a concomitant willingness to lose money for a long time to “buy” the prospect of a future monopoly position—think of Amazon or Uber.
Are tech firms like Google, Amazon, and Facebook monopolies?
Here we need to distinguish between statics and dynamics, or between a transient monopoly and a permanent one. Large economies of scale as well as substantial network externalities imply that we often have monopolies or tight oligopolies in the new economy. The key issue is that of “contestability.” Monopolies are not ideal, but they deliver value to the consumers as long as potential competition keeps them on their toes. They will then be forced to innovate and possibly even to charge low prices so as to preserve a large installed base and try to make it difficult for the entrants to dislodge them.
But for such competition to operate, two conditions are necessary: Efficient rivals must, first, be able to enter and, second, enter when able to. In practice, they may find it difficult to enter a market. And if they successfully enter, they may find it more profitable to be swallowed up by the incumbent rather than to compete with it. In economics parlance, such “entries for buyout” create very little social value as they are mainly a mechanism for the entrant to appropriate a piece of the dominant firm’s rent.

And let us not forget Friendster.
Ten years ago it seemed like Walmart had monopoly power when it came to retail, but the market brought us Amazon. Is it possible that today’s tech monopolies will also face stiff competition one day?
Yes, and let’s not forget that Google replaced AltaVista in the search engine market and Facebook dislodged MySpace in the social network segment....MORE
Investment funds are making it easier to trade -- and short -- the riskiest type of bank debt, just as the notes’ winning streak comes to an end.
Invesco Ltd. is following WisdomTree in rolling out an exchange-traded fund tied to additional Tier 1 notes, the first debt to take losses in a bank crisis. The products will make it easier to trade in a market characterized by high coupons and high volatility, as well as liquidity squeezes at times of market stress.
“Liquidity is key,” said Michael Stewart, regional ETF product developer at Invesco. Many investors trade individual AT1s and they “want an easier way to diversify issuer risk,” he said.
The two AT1 funds come to market as contingent convertible bank notes head for a first-half loss that will end three years of outperformance versus senior bank debt. The notes have crumbled, after market-trouncing returns last year, partly because the looming end of quantitative easing has boosted yields in safer types of debt.
Invesco’s AT1 Capital Bond UCITS ETF, which trades under the ticker AT1 LN in London, tracks the iBoxx USD Contingent Convertible Liquid Developed Market AT1 index. A euro-hedged class will be launched on Thursday. WisdomTree’s AT1 ETF debuted last month.
ETFs ease trading in the generally illiquid bond market as they hold a number of different notes and act as a benchmark for a whole sector. Traders can then quickly buy, sell or short shares from the fund rather than finding someone for a deal tied to a specific bond....MORE
Facebook and Google steer us into sharing vast amounts of information about ourselves, through cunning design, privacy invasive defaults, and “take it or leave it”-choices, according to an analysis of the companies’ privacy updates.
As the new General Data Protection Regulation (GDPR) is implemented across Europe, users of digital services have been confronted with new privacy settings through numerous pop-up messages. Unfortunately, the Norwegian Consumer Council's just-published analysis demonstrates that companies appear to have little intention of giving users actual choices.
“These companies manipulate us into sharing information about ourselves. This shows a lack of respect for their users and circumvents the notion of giving consumers control of their personal data,” says Finn Myrstad, director of digital services at the Norwegian Consumer Council.
The Norwegian Consumer Council and several other consumer and privacy groups in Europe and the US are now asking European data protection authorities to investigate whether the companies are acting in accordance with the GDPR and US rules.
Sharing by default
Through the Consumer Council’s analysis of the companies’ privacy pop-ups, it is made evident that consumers are pushed into sharing through:
Standard settings
Research has shown that users rarely change pre-selected settings. In many cases, both Facebook and Google have set the least privacy-friendly choice as the default.
Cunning design choices
Sharing of personal data and the use of targeted advertising are presented as exclusively beneficial through wording and design, often in combination with threats of lost functionality if users decline....
In this report, we analyze a sample of settings in Facebook, Google and Windows 10, and show how default settings and dark patterns (techniques and features of interface design meant to manipulate users) are used to nudge users towards privacy-intrusive options. The findings include privacy-intrusive default settings, misleading wording, giving users an illusion of control, hiding away privacy-friendly choices, take-it-or-leave-it choices, and choice architectures where choosing the privacy-friendly option requires more effort from the users....MUCH MORE
Facebook and Google have privacy-intrusive defaults, where users who want the privacy-friendly option have to go through a significantly longer process. They even obscure some of these settings so that the user cannot know that the more privacy-intrusive option was preselected.
The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy-friendly choices. Choices are worded to compel users to make certain choices, while key information is omitted or downplayed. None of them lets the user freely postpone decisions. Also, Facebook and Google threaten users with loss of functionality or deletion of the user account if the user does not choose the privacy-intrusive option.
The GDPR settings from Facebook, Google and Windows 10 provide users with granular choices regarding the collection and use of personal data. At the same time, we find that the service providers employ numerous tactics in order to nudge or push consumers toward sharing as much data as possible.
To complement the analysis, we use two examples of how users are given an illusion of control through privacy settings. Firstly, Facebook gives the user an impression of control over use of third party data to show ads, while it turns out that the control is much more limited than it initially appears. Secondly, Google’s privacy dashboard promises to let the user easily delete user data, but the dashboard turns out to be difficult to navigate, more resembling a maze than a tool for user control....
In graphic and web design, a dark pattern is "a user interface that has been carefully crafted to trick users into doing things, such as buying insurance with their purchase or signing up for recurring bills." The neologism dark pattern was coined by Harry Brignull in August 2010 with the registration of darkpatterns.org, a "pattern library with the specific goal of naming and shaming deceptive user interfaces."...MORE