Twitter chief executive to defend company before Congress

WASHINGTON (Reuters) – Twitter Inc Chief Executive Jack Dorsey will tell Congress on Wednesday the company “does not use political ideology to make any decisions.”

FILE PHOTO: Jack Dorsey, CEO and co-founder of Twitter and founder and CEO of Square, speaks at the Consensus 2018 blockchain technology conference in New York City, New York, U.S., May 16, 2018. REUTERS/Mike Segar

Dorsey will testify before the U.S. House of Representatives’ Energy and Commerce Committee on Wednesday after Republicans raised concerns about how the social media platform polices content.

“From a simple business perspective and to serve the public conversation, Twitter is incentivized to keep all voices on the platform,” said Dorsey’s written testimony, which was made public on Tuesday. He added that a recent company review shows “no statistically significant difference” between how often tweets by Republican and Democratic members of Congress are viewed by Twitter users.

Reporting by David Shepardson; editing by Jonathan Oatis

Samsung Q8FN 4K Review: A Pretty But Pricey 4K Television

One of the best perks of the job is that I get to try some tech toys that are simply out of my price range. From high-end cameras to bonkers-expensive pro laptops, I realize I’m pretty spoiled. That’s why when I had the chance to try one of the newest Samsung 4K TVs in my apartment, a sense of dread came over me. Would swapping my dinky, three-year-old 40-inch for an expansive, pricey, 55-inch 4K unit ruin my life? Would I feel compelled to immediately jump onto the higher-def bandwagon and sell one of my kidneys for the pleasure?

Having now returned the Samsung TV to its rightful owners, I’m inclined to say no. This awe-inspiring quantum-dot-packing eye-fatiguingly luminous television didn’t quite make me rethink my entire existence. But why didn’t this luxe flat panel transform my low-contrast, standard dynamic range life into something brighter and happier than my cheap, old Sony? Two words: Frasier Crane.

Out and In

Opening the box and setting up the Samsung Q8FN was a joy. The panel is big, but it wasn’t too hard to wrestle the set out of the box by myself. Thankfully, instead of a complex stand, the two metal feet are held in by clips, and you won’t have to touch a screwdriver to get it onto your entertainment center. Even though I kind of miss the versatility of Samsung’s OneConnect system (which broke out the TV’s ports onto a separate box instead of leaving them all tucked away behind the TV), this year’s Q8FN seems way less cluttered. On its own on my TV stand, it struck a clean, austere profile.

Though they’re not effortlessly accessible, the Q8FN’s ports are at least plentiful. With four HDMI ports and a few USBs, you’ll be able to plug in plenty of inputs, be they PlayStation, Nintendo Switch, Blu-ray, or Apple TV. Because it’s a 4K HDR-capable set, I opted to plug in an Xbox One X, which can play streaming and disc-based media in that high resolution.

The OneRemote clicker is similar to what other high-end Samsung TVs include, eschewing a number pad for a simple iPod-like direction ring, two rockers for volume and channels, and a few other controls. The remote includes a microphone and a voice command trigger for use with Bixby, but I didn’t find it all that useful, since I skipped the TV’s built-in smart platform and broadcast TV in favor of streaming via the Xbox.

I’m Listening

After giving the set a few hours of break-in (I popped in my 4K Blu-ray of Star Wars: The Last Jedi and ran the film on a loop during an afternoon), I sat down and started checking out what there was on Netflix. I’ve been watching some classic shows recently, mostly switching between stretches of Frasier and Star Trek: Voyager. The problem? Both of these SD shows, no matter how the Samsung’s Q Engine chip tries to upscale, look terrible blown up on this TV. The resolution delta doesn’t help, but the big 55-inch screen makes the poor compression and lack of detail far more obvious than on our rinky-dink 40-inch set.

The sound wasn’t so hot either, since the speakers seem to be rear-firing and lacking in bass. Definitely invest in a sound bar.

But then there’s high-def content. The first movie I spent time watching on the TV was the terrific Black Panther in 4K HDR. This movie blew me away on this Samsung. Action scenes like the nighttime car chase in Busan, Korea, pushed the TV’s HDR to its limits. Thankfully, the set made short work of the fast-moving action and contrasty visuals, arguably making the movie look better than it did when I saw it in theaters. Even normal HD stuff like Netflix’s Luke Cage was spectacular to watch, bringing the show’s version of Harlem, with its resident baddies and goodies, vividly to life.

Games look great too—I spent hours flying space fighters in Star Wars: Battlefront II and was in awe of the blackness of space, nearly blinded by the glare off of nearby planets and capital ships. Granted, the Xbox One X is geared for high-resolution gaming, and when I changed inputs to the connected Nintendo Switch, Super Mario Odyssey was noticeably less crispy—it’s in HD, after all.

While this set’s “QLED” tech won’t give you OLED-level blacks, I was pleasantly surprised at the contrast the TV was able to output. With the full-array local dimming turned to the lowest setting and the TV’s brightness reduced a bit (out of the box, the Q8FN was aggressively bright), the set did a convincing OLED impression. I didn’t notice weird blooming or lag between when an object appeared on screen and when the closest backlight portion boosted its brightness. For an LCD, this Samsung justifies its premium perch in the QLED lineup.

When my review period was up, I was sad to have to put this gorgeous slab back into its box. Switching back to our old 40-inch Sony took an adjustment period, but, all things considered, I adjusted quickly. I went back to my routine of Frasier and Voyager, and though the experience isn’t as all-enveloping on our old HDTV, I was happy to watch those low-quality shows on a smaller, duller TV. I’ll be shopping for a 4K TV later this year. But maybe I’ll wait until I’m done with my ’90s TV binge.

2 Streaming Amps for Audiophiles: Naim Uniti, Bluesound

Streaming music doesn’t have to mean compromised sound. These hi-fi amps can help you find cloud-connected aural ecstasy.

1. Naim Audio Uniti Star

Best for: Stream-curious audiophiles

With a built-in CD player that rips tracks to a local drive, the Uniti Star eases the pain of parting with your CDs. Naim’s app summons your newly captured tunes and streams hi-res songs from cloud services. The hardware is pricey, but you get premium guts like a 70-watt-per-channel amp and a huge, velvet-smooth volume knob.

$5,995

2. Bluesound Powernode 2

Best for: Proud digital natives

Bluesound’s more modestly priced streamer can access oodles of cloud music services and radio stations—including hi-res offerings—or play a local library stacked with FLACs. Basic panel controls are supplemented by the excellent BluOS Controller app. The integrated 60-watt-per-channel amp can power any speakers, from tiny to towering.

$799

Styling by Reina Takahashi

This article appears in the August issue.



Fleeing White House Lawyers Top This Week's Internet News Roundup

It’s been a week that’s seen us inch ever closer to the collapse of NAFTA, seen the White House seemingly confused about how it collectively feels about the death of John McCain, and seen the official death toll of Hurricane Maria in Puerto Rico raised by almost 3,000, even though the President still claims the official response was “fantastic.” (No wonder his disapproval rating has hit a new high.) But what else has been going on this week? I’m glad you asked! Let’s let the internet answer that question, shall we?

You’re Fired (483rd Twitter Edition)

What Happened: Of all the people the President of the United States has pushed out of the White House, perhaps the White House lawyer wasn’t the best choice.

Where It Blew Up: Twitter, media reports

What Really Happened: Elsewhere in the legal worries of the leader of the free world, the reportedly perfectly fine, nothing wrong whatsoever relationship between President Trump and White House lawyer Don McGahn took a bit of a turn early this week, as the President tweeted out a personnel update.

Well, this seems perfectly normal and not something that people were cynically expecting after it emerged that McGahn had multiple meetings with Special Counsel Robert Mueller over the past few months. Still, at least he was given time to prepare for this decision…

On the plus side, everyone in Trump’s orbit must have been happy to see him go…

That’s 84-year-old Republican senator Chuck Grassley there, showing some hey-fellow-kids Twitter chops.

Even as everyone was still coming to terms with the White House lawyer being unceremoniously dismissed without notice, some people had some more thoughts to offer on how this related to the bigger picture:

But as with seemingly every bit of reporting, the President couldn’t resist taking to Twitter to argue against the conventional wisdom in his patented “Nuh-uh, just the opposite!” style, as was obvious on Thursday morning:

As should probably be expected at this point, most people took this as confirmation that just the opposite was actually true. But a third tweet made ears perk up amongst the political watchers:

The replacement in question…? That remains unclear at the time of writing, thanks to entirely conflicting reports:

Hey, maybe Rudy Giuliani could moonlight once he’s finished working on that counter-report.

The Takeaway: Curiously, McGahn wasn’t the only lawyer to leave the White House this week, although this departure was seemingly more voluntary:

Alienated Citizens of the World, Unite

What Happened: For those who thought that the current administration couldn’t get any more racist, I introduce to you: telling U.S. citizens they aren’t really citizens because they’re Hispanic.

Where It Blew Up: Twitter, media reports

What Really Happened: As if there weren’t enough reasons to feel concerned about the administration’s attitude towards immigration (Hundreds of children are still separated from their parents, in case you’re wondering), a new report from the middle of this week brought an additional wrinkle:

The Washington Post’s report alleged that American citizens were getting passport applications rejected in Texas, with “hundreds, and possibly thousands” of Hispanic citizens being accused of using fake birth certificates.

To call this a big deal would be a severe understatement, and the original report was quickly shared by other outlets across the internet. Twitter, too, was shocked by what was happening:

As might be expected, the State Department pushed back on the reports, but there was one obvious problem with that…

Oh, and it’s not just passports or the administration, as it turns out:

The Takeaway: Yeah, this isn’t terrifying in the slightest. Maybe there’s a silver lining to be found somewhere…

From Give and Take and Still Somehow

What Happened: The President and his lawyers have come up with a new plan to combat the special investigation into potential collusion with Russia: release their own fake report. No, really.

Where It Blew Up: Twitter, media reports

What Really Happened: You know what they always say: If you can’t beat them, release your own version of something and just pretend that they’re entirely equivalent. And speaking of the current special counsel investigation into the President of the United States and potential collusion with foreign entities…

There are all manner of obvious flaws in this plan, such as who would believe a report put together by the subject of the investigation? (I mean, sadly, we know the answer, but still.) There’s also this small drawback:

That is a problem. How can you write a rebuttal to a mystery topic…?

Actually, the apparent truth is only incrementally less likely:

Somewhat amazingly, this turns out not to be the first time the subject has been raised publicly by Giuliani, the president’s personal attorney. But, sure, this definitely sounds like a good use of everyone’s time:

If nothing else, he’ll have to work quickly in order to—as the original report put it—release the report within minutes of Mueller’s official, actually researched one.

Let’s be real: There’s almost no way this could fail.

The Takeaway: Who couldn’t be convinced by a well-reasoned argument from this guy?

Why They Changed It I Can’t Say

What Happened: New York got an unexpected name change this week on certain apps, thanks to an act of anti-Semitic “digital graffiti.”

Where It Blew Up: Twitter, media reports

What Really Happened: New York users of Snapchat, the Weather Channel, and other online services with built-in maps received an unpleasant surprise on Thursday morning:

Of course, this quickly went viral, because of course it did. The root, as it happened, was quickly identified—

—and dealt with:

But what caught some people’s attention was the slur chosen for the city’s new name—and how much of a failure it ultimately was:

Others wondered if New York’s new identity could be an improvement:

Sadly, not everyone was happy with the takeover:

The Takeaway: It wouldn’t be a New York moment without at least one person fondly remembering the good old days…

Slight Return

What Happened: After less than a year away, Louis C.K. has stepped back into the spotlight to return to comedy—and it turns out people aren’t really into that idea so much.

Where It Blew Up: Twitter, media reports

What Really Happened: Hey, remember last November, when comedian Louis C.K. admitted that reports of his sexually harassing several women, including masturbating in front of them, were true? Remember when he issued a statement saying that he was going to “step back and take a long time to listen”?

Well, that was certainly nine months’ worth of listening, I guess. Yes, Louis C.K. returned to the public stage this week (although it turns out he’d actually made a more low-key comeback earlier than that), and it was a return that prompted a very strong response online.

With all kinds of think pieces published in response, the overall feeling about C.K.’s return could be summed up in one simple tweet:

As if to illustrate that last point, an additional fact about C.K.’s set emerged a day later…

The Takeaway: But perhaps we’re being too hard on the comedian…



We Need To Reengineer Our Organizations For A New Era Of Innovation

In the first half of the 20th century, Alfred Sloan created the modern corporation at General Motors. In many ways, it was based on the military. Senior leadership at headquarters would make plans, while managers at individual units would be allocated resources and made responsible for achieving mission objectives.

The rise of digital technology made this kind of structure untenable. By the time strategic information was gathered centrally, it was often too old to be effective. In much the same way, by the time information flowed up from operating units, it was too late to alter the plan. It had already failed.

So in recent years, the management mantra has become agility and iteration. Due to pressures from the market and from shareholders, long-term planning is often eschewed for the needs of the moment. Yet today the digital era is ending and organizations will need to shift once again. We’re going to need to learn to combine long-range planning with empowered execution.

Shifting from Iteration to Exploration

When Steve Jobs came up with the idea for a device that would hold “a thousand songs in my pocket,” it wasn’t technically feasible. There was simply no hard drive available that could fit that much storage into that little space. Nevertheless, within a few years a supplier developed the necessary technology and the iPod was born.

Notice how the bulk of the profits went to Apple, which designed the application, and very little to the supplier that developed the technology that made it possible. That’s because the technology for developing hard drives was very well understood. If it hadn’t been that supplier, another would have developed what Jobs needed in six months or so.

Yet today, we’re on the brink of a new era of innovation. New technologies, such as revolutionary computing architectures, genomics and artificial intelligence, are coming to the fore that aren’t nearly as well understood as digital technology. So we will have to spend years learning about them before we can develop applications safely and effectively.

For example, companies ranging from Daimler and Samsung to JP Morgan Chase and Barclays have joined IBM’s Q Network to explore quantum computing, even though it will be years before that technology has a commercial impact. Leading tech companies have formed the Partnership on AI to better understand the consequences of artificial intelligence. Hundreds of companies have joined manufacturing hubs to learn about next-generation technology.

It’s becoming more important to prepare than adapt. By the time you realize the need to adapt, it may already be too late.

Building a Pipeline of Problems to be Solved

While the need to explore technologies long before they become commercially viable is increasing, competitive pressures show no signs of abating. Just because digital technology is not advancing the way it once did doesn’t mean that it will disappear. Many aspects of the digital world, such as the speed at which we communicate, will continue.

So it is crucial to build a continuous pipeline of problems to solve. Most will be fairly incremental, either improving on existing products or developing new ones based on standard technology. Others will be a bit more aspirational, such as applying existing capabilities to a new market or adopting new technology to improve service to existing customers.

However, as the value generated from digital technology continues to level off, much like it did for earlier technologies like internal combustion and electricity, there will be an increasing need to pursue grand challenges to solve fundamental problems. That’s how truly new markets are created.

Clearly, this presents some issues with resource allocation. Senior managers will have to combine the need to move fast and keep up with immediate competitive pressures with the long-term thinking it takes to invest in years of exploration with an uncertain payoff. There’s no magic bullet, but it is generally accepted that the 70/20/10 principle for incremental, adjacent and fundamental innovation is a good rule of thumb.

Empowering Connectivity

When Sloan designed the modern corporation, capacity was a key constraint. The core challenge was to design and build products for the mass market. So long-term planning to effectively organize plant, equipment, distribution and other resources was an important, if not decisive, competitive attribute.

Digitization and globalization, however, flipped this model and vertical integration gave way to radical specialization. Because resources were no longer concentrated in large enterprises, but distributed across global networks, integration within global supply chains became increasingly important.

With the rise of cloud technology, this trend became even more decisive in the digital world. Creating proprietary technology that is closed off to the rest of the world has become unacceptable to customers, who expect you to maintain APIs that integrate with open technologies and those of your competitors.

Over the next decade, it will become increasingly important to build similar connection points for innovation. For example, the US military set up the Rapid Equipping Force that was specifically designed to connect new technologies with soldiers in the field who needed them. Many companies are setting up incubators, accelerators and corporate venture funds for the same reason. Others have set up programs to connect to academic research.

What’s clear is that going it alone is no longer an option and we need to set up specific structures that not only connect to new technology, but ensure that it is understood and adopted throughout the enterprise.

The Leadership Challenge

The shift from one era to another doesn’t mean that old challenges are eliminated. Even today, we need to scale businesses to service mass markets and rapidly iterate new applications. The problems we need to take on in this new era of innovation won’t replace the old ones; they will simply add to them.

Still, we can expect value to shift from agility to exploration as fundamental technologies rise to the fore. Organizations that are able to deliver new computing architectures, revolutionary new materials and miracle cures will have a distinct competitive advantage over those who can merely engineer and design new applications.

It is only senior leaders who can empower these shifts, and it won’t be easy. Shareholders will continue to demand quarterly profit performance. Customers will continue to demand product performance and service. Yet it is only those that are able to harness the technologies of this new era — which will not contribute to profits or customer satisfaction for years to come — that will survive the next decade.

The one true constant is that success eventually breeds failure. The skills and strategies of one era do not easily translate to another. To survive, the key organizational attribute will not be speed, agility or even operational excellence, but leadership that understands that when the game is up, you need to learn how to play a new one.

This Little Known Program at the Department of Energy Is Helping to Create a New Future In Manufacturing

In the recession that followed the dotcom crash in 2000, the United States lost 5 million manufacturing jobs and, while there has been an uptick in recent years, all indications are that they are never coming back. Manufacturing, perhaps more than any other sector, relies on deep networks of skills and assets that tend to be regional.

The consequences of this loss are deep and pervasive. Losing a significant portion of our manufacturing base has led not only to economic vulnerability, but to political polarization. Clearly, it is important to rebuild our manufacturing base. But to do that, we need to focus on new, more advanced technologies.

That’s the mission of the Advanced Manufacturing Office (AMO) at the Department of Energy. By providing a crucial link between the cutting-edge science done at the National Labs and private industry, it has been able to make considerable progress. As the collaboration between government scientists and private industry widens and deepens over time, US manufacturing may well be revived.

Linking Advanced Research to Private Industry

The origins of the Department of Energy date back to the Manhattan Project during World War II. The immense project was, in many respects, the start of “big science.” Hundreds of top researchers, used to working in small labs, traveled to newly established outposts to collaborate at places like Los Alamos, New Mexico and Oak Ridge, Tennessee.

After the war was over, the facilities continued their work and similar research centers were established to expand the effort. These National Labs became the backbone of the US government’s internal research efforts. In 1977, the National Labs, along with a number of other programs, were combined to form the Department of Energy.

I was able to visit the Innovation Crossroads facility at Oak Ridge National Laboratory and meet the entrepreneurs in its current cohort. Each is working to transform a breakthrough discovery into a market-changing application that, due to technical risk, would not yet be able to attract funding in the private sector. The LEAP program offers a small amount of seed money, access to lab facilities and scientific and entrepreneurial mentorship to help them get off the ground.

That’s just one of the ways that the AMO opens up the resources of the National Labs. It also helps businesses get access to supercomputing resources (5 out of the 10 fastest computers in the world are located at the National Labs) and conducts early stage research to benefit private industry.

Leading Public-Private Consortia

The idea behind these consortia is to create hubs that provide a critical link between government labs, top scientists at universities and private companies looking to solve real-world problems. It both helps firms advance in key areas and allows researchers to focus their work on areas that will have the greatest possible impact.

For example, the Critical Materials Institute (CMI) was set up to develop alternatives to materials that are subject to supply disruptions, such as the rare earth elements that are critical to many high tech products and are largely produced in China. It recently developed, along with several National Labs and Eck Industries, an advanced alloy that can replace more costly materials in components of advanced vehicles and aircraft.

“We went from an idea on a whiteboard to a profitable product in less than two years and turned what was a waste product into a valuable asset,” Robert Ivester, Director of the Advanced Manufacturing Office told me.

Technology Assistance Partnerships

In 2011, the International Organization for Standardization released its ISO 50001 guidelines. Like previous guidelines that focused on quality management and environmental impact, ISO 50001 recommends best practices to reduce energy use. These can benefit businesses through lower costs and result in higher margins.

Still, for harried executives facing cutthroat competition and demanding customers, figuring out how to implement new standards can easily get lost in the mix. So a third key role that the AMO plays is to assist companies who wish to implement new standards by providing tools, guides and access to professional expertise.

The AMO offers similar support in a number of critical areas, such as prototype development, and also provides energy assessment centers for firms that want to reduce costs. “Helping American companies adopt new technology and standards helps keep American manufacturers on the cutting edge,” Ivester says.

“Spinning In” Rather Than Spinning Out

Traditionally we think of the role of government in business largely in terms of regulation. Legislatures pass laws and watchdog agencies enforce them so that we can have confidence in the food we eat, the products we buy and the medicines that are supposed to cure us. While that is clearly important, we often overlook how government can help drive innovation.

Inventions spun out of government labs include the Internet, GPS and laser scanners, just to name a few. Many of our most important drugs were also originally developed with government funds. Still, traditionally the work has mostly been done in isolation and only later offered to private companies through licensing agreements.

What makes the Advanced Manufacturing Office different from most scientific programs is that it is more focused on “spinning in” private industry rather than spinning out technologies. That enables executives and entrepreneurs with innovative ideas to power them with some of the best minds and most advanced equipment in the world.

As Ivester put it to me, “Spinning out technologies is something that the Department of Energy has traditionally done. Increasingly, we want to spin ideas from industry into our labs, so that companies and entrepreneurs can benefit from the resources we have here. It also helps keep our scientists in touch with market needs and helps guide their research.”

Make no mistake, innovation needs collaboration. Combining ideas from the private sector with the cutting-edge science from government labs can help American manufacturing compete in the 21st century.

The Ecologist on a Mission to Count New York's Whales

The first thing you notice about ecologist Arthur Kopelman is his giant white beard. The second is the gold whale charm dangling from his earlobe—a symbol of the creature that has consumed his thoughts for decades.

“I don’t think I’ve ever seen him without it,” says Joe Carrotta, a photographer who documented Kopelman’s whale-watching cruises up and down the New York coast last summer. The boat rides allow Kopelman to collect data for the Coastal Research and Education Society of Long Island—an organization he co-founded in 1996—while also educating passengers about the incredible cetaceans and pinnipeds swimming (and singing) just miles from shore.

“People are surprised to learn there are marine mammals in New York,” Kopelman says, “perhaps because it’s an area that also has some of the densest human populations in the world.”

The New York Bight—a coastal region stretching from the northern tip of Long Island to southern New Jersey—is a frolicking ground for 19 species of whales, dolphins and porpoises, as well as four species of seal. But in the 1950s, when Kopelman was just a kid in Queens, few people thought about them; whales were mythic figures from the past, long banished by industrial pollution and hunting. But following the Clean Water and Marine Mammal Protection Acts of 1972, they returned. Today, hundreds of humpback, fin and right whales cruise the bight at any time, gobbling up schools of menhaden, a silver fish too oily for Manhattan’s delicatessens.

It’s not all great on the open water, though. While humpback populations are increasing, right whales aren’t doing so well—last year, 17 out of the 450 inhabiting the North Atlantic were killed in Canadian and US waters. Counting these communities has become crucial, allowing researchers to monitor their abundance and distribution. Organizations like Gotham Whale and the Wildlife Conservation Society do so within the harbors and near Fire Island, while CRESLI does so on the eastern end of the bight.

But that’s not all the cruises are for. “Besides counting, my objective is to educate people about the whales, so they become informed stakeholders who will protect them,” Kopelman says.

Oddly enough, Kopelman began his scientific career in the 1970s studying a creature several orders of magnitude smaller: Leptopilina boulardi, a two-millimeter wasp that lays its eggs in the larvae of fruit flies. He did that for nine years before switching phyla to whales. “I’d always been an activist,” he says, “and I decided to put my actions where my rhetoric was.” This single-minded passion fascinated Carrotta when the two met in 2016, inspiring Carrotta to tag along on 10 whale-watching cruises and seal walks.

All took place aboard the 140-foot-long Viking Starship, a gleaming vessel the captain steered toward known whale feeding spots and other places whales were recently reported. Passengers on board marveled as they saw humpbacks break the surface, slapping their fins and tails around to communicate. Kopelman kept a log of the cetaceans and pelagic birds they saw, snapping photos of the animals’ patterning to add to his searchable database of nearly 80,000 images. When he saw a familiar animal, he called out its name—“Draco,” “Glo,” “Infinity”—over the PA system. “He’s very serious about marine life, but you can still hear his excitement when he gets to talking about it, even over the loudspeaker,” Carrotta says.

Carrotta photographed it all with a couple of Nikon DSLRs and a Profoto strobe. His images sketch a vivid portrait of Kopelman and the charismatic megafauna that inspires his life’s work—and, occasionally, his fashion accessories. Sadly, Kopelman lost his whale charm this summer. “I came home from a day on the water and it was gone,” he says. Not to worry, though: He had a backup.



German 5G auction roaming proposal keeps barriers high for new entrants

BERLIN (Reuters) – German mobile phone operators will not be required to allow national roaming when they roll out 5G services, the country’s network agency said in a document, which could make it harder for new entrants to take on the incumbent providers.

FILE PHOTO: A 5G sign is seen during the Mobile World Congress in Barcelona, Spain February 28, 2018. REUTERS/Yves Herman/File Photo

The auction for the 5G, or fifth generation, spectrum licenses in Europe’s largest telecoms market is planned for early 2019, and details are being closely watched to determine whether smaller contenders will have a realistic chance to compete with the established players.

The three existing operators – Deutsche Telekom, Vodafone and Telefonica Deutschland – have pushed back against calls from potential entrants and the German cartel office for auction terms that would lower barriers to entry.

But there are concerns among industry analysts that market concentration has left Europe’s largest economy lagging its rivals in the race to build network-dependent connected factories or put self-driving cars on the road.

Germany’s antitrust regulator last week called for a fourth mobile operator to enter the market for the 5G auction, rebutting arguments from the “Big Three” that more competition would hit investment.

The proposed terms for the 5G auction, laid out in a document from network agency Bundesnetzagentur (BNetzA), do not include a binding commitment to allow national roaming – which would let a new entrant rent network access where it lacks coverage – a key demand of smaller player United Internet, which is considering bidding.

Without the roaming commitment the incumbent operators can choose whether or not they want to allow the new entrants access to their networks, for example, in rural areas.

The proposed terms, which will be discussed by BNetzA’s advisory board on Sept. 24, also foresee that at least 98 percent of German households need to be supplied with a high-speed connection of 100 megabits per second by the end of 2022.

At least 50 megabits per second must be available for busy regional and long-distance railroad traffic lines.

“The coverage obligations may not be quite as extraordinary as BNetzA describes them but at first sight do impose material capex requirements on the network operators, in our view,” Jefferies analysts wrote in a note.

Asked about auction proceeds, BNetzA president Jochen Homann told German paper Handelsblatt that the government was unlikely to generate proceeds in line with those from the UMTS, or 3G, auction in 2000, which amounted to 50 billion euros ($58 billion).

Deutsche Telekom, late on Thursday, said it expected the BNetzA to refrain from further regulatory interventions into the mobile phone market, adding the proposals were counterproductive.

Vodafone, meantime, criticized requirements to supply federal main roads with 100 megabits per second as “unacceptable”, warning of high costs.

($1 = 0.8568 euros)

Reporting by Nadine Schimroszik and Markus Wacket. Writing by Christoph Steitz. Editing by Jane Merriman and Kirsten Donovan

Medtech firms get personal with digital twins

HEIDELBERG, Germany (Reuters) – Armed with a mouse and computer screen instead of a scalpel and operating theater, cardiologist Benjamin Meder carefully places the electrodes of a pacemaker in a beating, digital heart.

A three-dimensional printout of a human heart is seen at the Heidelberg University Hospital (Universitaetsklinikum Heidelberg) in Heidelberg, Germany, August 14, 2018. REUTERS/Ralph Orlowski

Using this “digital twin” that mimics the electrical and physical properties of the cells in patient 7497’s heart, Meder runs simulations to see if the pacemaker can keep the congestive heart failure sufferer alive – before he ever picks up a knife.

The digital heart twin developed by Siemens Healthineers is one example of how medical device makers are using artificial intelligence (AI) to help doctors make more precise diagnoses as medicine enters an increasingly personalized age.

The challenge for Siemens Healthineers and rivals such as Philips and GE Healthcare is to keep an edge over tech giants from Alphabet’s Google to Alibaba that hope to use big data to grab a slice of healthcare spending.

With healthcare budgets under increasing pressure, AI tools such as the digital heart twin could save tens of thousands of dollars by predicting outcomes and avoiding unnecessary surgery.

A shortage of doctors in countries such as China is also spurring demand for new AI tools to analyze medical images and the race is on to commercialize products that could shake up healthcare systems around the world.

While AI has been used in medical technology for decades, the availability of vast amounts of data, lower computing costs and more sophisticated algorithms mean revenues from AI tools are expected to soar to $6.7 billion by 2021 from $811 million in 2015, according to a study by research firm Frost & Sullivan ww2.frost.com.

The size of the global medical imaging analytics software market is also expected to jump to $4.3 billion by 2025 from $2.4 billion in 2016, said data portal Statista www.statista.com.

“What started as an evolution is accelerating towards more of a revolution,” said Thomas Rudolph who leads McKinsey & Company’s www.mckinsey.com pharma and medical technology practice in Germany.

‘GPS OF HEALTHCARE’

For Siemens Healthineers and its traditional rivals, making the transition from being mainly hardware companies to medical software pioneers is seen as crucial in a field becoming increasingly crowded with new entrants.

Google has developed a raft of AI tools, including algorithms that can analyze medical images to diagnose eye disease, or sift through digital records to predict the likelihood of death.

Alibaba, meanwhile, hopes to use its cloud and data systems to tackle a shortage of medical specialists in China. It is working on AI-assisted diagnosis tools to help analyze images such as CT scans and MRIs.

Siemens Healthineers, which was spun off from German parent Siemens in March, has outpaced the market in recent quarters with sales of medical imaging equipment thanks to a slew of new products.

But analysts say the German firm, Dutch company Philips and GE Healthcare, a subsidiary of General Electric, will all come under pressure to prove they can save healthcare systems money as spending becomes more linked to patient outcomes and as hospitals rely on bulk purchasing to push for discounts.

A general view of the Klaus-Tschira-Institute for Integrative Computational Cardiology, department of the Heidelberg University Hospital (Universitaetsklinikum Heidelberg), in Heidelberg, Germany, August 14, 2018. REUTERS/Ralph Orlowski

Siemens Healthineers has a long history in the industry. It made the first industrially manufactured X-ray machines in 1896 and is now the world’s biggest maker of medical imaging equipment.

Now, Chief Executive Bernd Montag’s ambition is to transform it into the “GPS of healthcare” – a company that harnesses its data to sell intelligent services, as well as letting smaller tech firms develop Apps feeding off its database.

As it adapts, Siemens Healthineers has invested heavily in IT. It employs some 2,900 software engineers and has over 600 patents and patent applications in machine learning.

It is not alone. Philips says about 60 percent of its research and development (R&D) staff and spending is focused on software and data science. The company said it employs thousands of software engineers, without being specific.

MEDICAL REVOLUTION

Experts say the success of AI in medical technology will hinge on access to reliable data, not only to create models for diagnosis but also to predict how effective treatments will be for a specific patient in the days and years to come.

“Imagine that in the future, we have a patient with all their organ functions, all their cellular functions, and we are able to simulate this complexity,” said Meder, a cardiologist at Heidelberg University Hospital here in Germany who is testing Siemens Healthineers’ digital heart software.

“We would be able to predict weeks or months in advance which patients will get ill, how a particular patient will react to a certain therapy, which patients will benefit the most. That could revolutionize medicine.”

To this end, Siemens Healthineers has built up a vast database of more than 250 million annotated images, reports and operational data on which to train its new algorithms.

In the example of the digital twin, the AI system was trained to weave together data about the electrical and physical properties and the structure of a heart into a 3D image.

One of the main challenges was hiding the complexity and creating an interface that is easy to use, said Tommaso Mansi, a senior R&D director at Siemens Healthineers who developed the software.

To test the technology, Meder’s team created 100 digital heart twins of patients being treated for heart failure in a six-year trial. The computer makes predictions based on the digital twin and they are then compared with actual outcomes.

His team hopes to finish evaluating the predictions by the end of 2018. If the results are promising, the system will be tested in a larger, multi-center trial as the next step to getting the software approved by regulators for commercial use.

Siemens Healthineers declined to say when the technology might eventually be used by clinics or give details on how its digital heart, or models of other organs it is developing such as the lungs and liver, could be monetized.


IN DATA WE TRUST

Both GE and Philips are also working on versions of digital heart twins while non-traditional players have been active too.

Drawing on its experience of making digital twins to test bridges and machinery, French software firm Dassault Systemes launched the first commercial “Living Heart” model in May 2015, though it is only currently available for research.

Philips sells AI-enabled heart models that can, for example, turn 2D ultrasound images into data that helps doctors diagnose problems, or automatically analyze scans to help surgeons plan operations.

Its vision, like that of Siemens Healthineers, is to add more complexity to its existing heart models by pulling together scans, ECGs and medical records to create a model that can predict how a heart will respond to therapy in real life.

For now, such software is still in the early stages of development and companies will have to work with regulators to thrash out how predictive models can be approved before doctors are willing to trust a diagnosis generated by a machine.

Access to high-quality data with enough variation will be crucial, as will be the ability to interpret that data and turn it into something medical professionals can use, say experts.

In particular, models will have to be trained on rare cases as they get closer to perfection, said Vivek Bhatt, chief technology officer at GE Healthcare’s clinical care solutions division.

“It’s going to be extremely critical to have an ongoing process for getting more data, getting the right kind of data and getting data with those unique cases,” he said.

The established medtech players say their long-running relationships with hospitals and research institutes and vast networks of installed machines will give them an edge over new tech entrants.

Siemens Healthineers, GE Healthcare and Philips say their databases are fed with a mixture of publicly-available data, data from clinical trials or from collaborations with hospitals – as well as some data from customers. All the data is made anonymous and only used with patients’ consent, they say.

Still, some campaigners and academics worry about patients’ data being used primarily by companies as a commercial tool.

Boris Bogdan, managing director at Accenture’s www.accenture.com life science practice in Switzerland, believes the ownership of data is a gray zone that could lead to a patient backlash if companies start making fortunes from it.

“When Facebook started nobody really cared who owned the information,” he said.

“Now that people understand that Facebook earns tremendous money with their data, questions like data privacy, data usage and data monetization are becoming more visible.”


Reporting by Caroline Copley; editing by David Clarke

I'm Not Here to Make Friends: The Rise and Fall of the Supercut Video

In the summer of 2008, Rich Juzwiak was working as a culture writer for VH1’s website, a job that required him to keep up with a daunting assortment of reality-television shows. Some were well-established hits: Survivor, The Apprentice, Big Brother. Others were single-season oddities, like The White Rapper Show or Crowned. No matter what show he was watching or reading about, Juzwiak had noticed that, at some point, one of the contestants invariably made the same defiant claim.

“I’m not here to make friends.”

For months, Juzwiak pulled together as many clips as he could find featuring the phrase, a process that required searching numerous recaps and frequenting dubious video archives. “It was not the age of streaming,” Juzwiak says. “It was the era of sketchy sites: ‘If I download this file, will it actually be the file, or am I giving myself a virus?’” Eventually, he compiled more than three minutes’ worth of examples, which he edited together and released on his personal blog in July 2008, under the title “I’m Not Here to Make Friends!”

At the time, Juzwiak’s video was one of the more ambitious—and more popular—examples of an emerging genre of web video: the supercut. It was a catch-all term for a new wave of fast-moving, detail-obsessed videos that isolated a recurring pop-culture trope. Some supercuts were specific to a movie or series, like a compilation of every ridiculous one-liner uttered by David Caruso on CSI: Miami. Others cataloged decades-old Hollywood cliches, like dramatically yelling “It’s showtime!”


The first supercuts had begun popping up on YouTube shortly after the site’s 2005 launch, but they achieved web-wide awareness a decade ago, not long after writer and net-culture observer Andy Baio first coined the term “supercut” in April 2008. A few months later, “I’m Not Here to Make Friends” arrived, followed by such crucial supercuts as people saying “What?” on Lost and an experimental clip in which all of the words were removed from George W. Bush’s 2008 State of the Union address. For years afterward, supercuts would become a staple of culture blogs and movie and TV sites, documenting everything from Schwarzenegger screams to Nic Cage freak-outs to Bill Gates saying “uh” a lot.

The length and reach of these classic ‘cuts varied, as did their intent. Some were merely having fun, pointing out ridiculous, overused catch-phrases; others were serving as a sly bit of cultural commentary. A few of the best supercuts, like “I’m Not Here to Make Friends,” were both. “It was about exposing the tropes,” says Juzwiak. “One of my obsessions—and maybe my chip on my shoulder—are things that treat viewers like they’re dumb. And when you see the pattern, you hit back at it.”

Ten years after the surge of the supercut, though, the era of the obsessive, affectionately critical montage appears to be over, done in by changing technologies, diminishing attention spans, and an exhaustion of ideas. As Caruso might say, it looks like the supercut has been…[pauses to put on sunglasses]…snipped.

In many ways, the supercut era could only have arrived in the mid-to-late ‘00s, a period in which the web had yet to begin its transition from quasi-lawless creative outpost to monetized, polarized aggravation machine. YouTube, Facebook, and Twitter were just a few years old; Instagram and Vine were still a few years away; and people were making memes and deeply creative clips largely just for fun. The idea of getting rich or famous from something you played around with in your spare time seemed LOL-worthy. “There was never any sort of endgame I thought of, beyond just completing a video,” says Juzwiak. “It was a period where there wasn’t really a sense of where things you made would end up, or how people would dissect them.”

The idea of chopping up and mocking pop culture in video form was hardly invented by the web, of course, as artists have been playing with on-screen imagery for decades. One of the forefathers of the modern supercut movement was Derrick Beckles, who in the ‘90s began slicing up bizarro late-night movies, commercials, and music videos into VHS-era compilations he released under the name TV Carnage. One of his videos, “Give Me Your Badge and Your Gun,” compiled various scenes of tough-as-nails police chiefs demanding that some upstart underling hand over his or her shield and weapon.

Years later, by the late ‘00s, iMovie and the recently introduced Adobe Premiere Pro had made it possible for video artists to put together montages with relative ease, even if they were learning on the fly. Juzwiak wasn’t a professionally trained editor when he made his supercuts—”I had more of a D.I.Y., ‘I know what I need to do to put this together’ approach,” he says—but neither were many of his supercutting peers. In 2009, Duncan Robson created “Let’s Enhance,” which compiled several scenes of TV and movie characters staring at ridiculous zoom-in technologies. He didn’t have any digital-editing experience, but using Premiere, he was able to put together the nearly two-minute-long clip from illegally ripped footage he’d found around the web. “It was all pirated,” Robson says.

What made clips like “Let’s Enhance” especially potent in the late 2000s was the fact that web users had the bandwidth to watch longer, more intricate home-made videos. There simply wasn’t quite as much content to absorb, and a five-minute supercut—like Juzwiak’s collection of horror-movie characters being stranded without cell-phone service—could earn hundreds of thousands of views. “The climate was different, and the technology was different,” says writer-producer Debbie Saslaw, who worked on supercuts for the humor site Slacktory. “Facebook wasn’t pumping out videos, and YouTube wasn’t a content repository. You could just push things out on Tumblr.”

Partly as a result, Saslaw notes, “Supercuts got tons of press.” Big supercuts like the now-taken-down “Mad Smoke: Every Cigarette Smoked in Mad Men” got written up in multiple outlets. The supercuts that got passed around tended to zoom in on (and enhance) a specific phrase or behavior that viewers had long observed, but never quite articulated: Gee, Julianne Moore really does cry a lot! And Sean Bean really does die a lot!

For many creators, though, the joy in making a supercut wasn’t simply piling on as many one-liners or moments as possible; it was trying to unite them all, visually and thematically. “They give you an opportunity to structure something in an interesting way, and to find little connections between things, other than the obvious ones,” says Robson, whose “Let’s Enhance” video pulls off a canny trick: Though it’s sourced from several prime-time detective shows and films, it ultimately feels as if every character is in the same room, staring at the same screen. Even if you haven’t watched CSI or Alias, you know the trope being mocked.

And if you are a fan, “Let’s Enhance” is a gentle reminder of these shows’ ridiculousness. A good supercut can be a critique, but it’s usually an unscathing one—another reason why they were so beloved for so long. “It’s a way to comment on something, but not have your point of view be so didactic that you’re shoving it down people’s throats,” says Juzwiak. “And as somebody who writes criticism, I know it’s hard to reach people like that—or to make them even care.”

Within a few years of their ascent, supercuts became so well-established that the bigger media companies that had once frowned on them—even having some taken down for copyright violations—were now commissioning their own. Showtime hired Saslaw away from Slacktory, where she was responsible for supercuts like a compilation of Claire Danes’ “Cry Face” and a roundup of “Law & Order’s Fakest Websites,” to make network-sanctioned supercuts like cast members of The Real L Word saying “drama.” (That clip was inspired in part by “I’m Not Here to Make Friends”: “It was one of the first supercuts I’d seen,” Saslaw says. “Rich is one of my heroes.”)

By then, making supercuts had become far less labor-intensive, as supercutters no longer had to yank illegally duped clips from sites like Megaupload or the Pirate Bay. Everything was pretty much online, whether streaming or via direct download. Nowadays, says Saslaw, “they’re really easy to do. All you have to do is search a script database to find people saying a term, download the movies, rip and extract that clip, and put ’em all together.”
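For the curious, that recipe maps almost one-to-one onto a short script. The sketch below is purely illustrative: a minimal Python wrapper around the open-source ffmpeg command-line tool, not a tool any of the supercutters quoted here actually used, and the source files, timestamps, and clip lengths are invented stand-ins for footage you would have tracked down yourself.

```python
# Hypothetical sketch of the supercut workflow described above: trim a few
# hand-picked moments, then splice them together. Assumes ffmpeg is installed
# and on the PATH; the file names and timestamps below are made up.
import subprocess
from pathlib import Path

# (source file, start time, clip length in seconds), gathered by hand,
# e.g. by searching a script or subtitle database for a catchphrase.
CLIPS = [
    ("reality_s01e04.mp4", "00:12:31", 4),
    ("reality_s02e09.mp4", "00:41:07", 3),
    ("reality_s03e02.mp4", "00:05:55", 5),
]


def cut_clip(source: str, start: str, length: int, out: Path) -> None:
    """Extract one clip. Stream copying (-c copy) is fast but only cuts on
    keyframes; drop it and re-encode if you need frame-accurate edits."""
    subprocess.run(
        ["ffmpeg", "-y", "-ss", start, "-i", source,
         "-t", str(length), "-c", "copy", str(out)],
        check=True,
    )


def build_supercut(output: str = "supercut.mp4") -> None:
    work = Path("clips")
    work.mkdir(exist_ok=True)

    parts = []
    for i, (source, start, length) in enumerate(CLIPS):
        out = work / f"clip_{i:03d}.mp4"
        cut_clip(source, start, length, out)
        parts.append(out)

    # ffmpeg's concat demuxer reads a plain-text playlist; paths in it are
    # resolved relative to the playlist file. Stream copying assumes every
    # clip shares the same codec settings; otherwise re-encode instead.
    playlist = work / "playlist.txt"
    playlist.write_text("".join(f"file '{p.name}'\n" for p in parts))
    subprocess.run(
        ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
         "-i", str(playlist), "-c", "copy", output],
        check=True,
    )


if __name__ == "__main__":
    build_supercut()
```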

But the ease of making supercuts also led to a glut of clips that were far less effective than the ones that had initially criss-crossed the web in 2008. They’d begun to grow at once more completist and less focused, collecting scenes that may have shared some connection, but didn’t make a real point: As much work as it takes to put together something like “50 Heartbreaking Movie Moments,” it feels more like an all-inclusive montage than a specific supercut. The very term had become a trope. “I’ve seen ‘supercut’ used to describe videos that are just things edited together,” says Robson.

Yet perhaps the simplest reason why supercuts have become more rare is because, after 10 years, nearly every well-tested cliche and catchphrase has already been documented: If you’re watching an old movie and suddenly see something deserving of a supercut, it’s likely already been done.

And even when someone does put together an ingenious, hyper-focused video worthy of being called a supercut—like this year’s “Every Time Denzel Washington Has Ever Clapped”—the web is much more crowded than it was ten years ago. “Anything longform is probably going to be received with crickets today,” says Juzwiak. “You’re a lot better off posting that video of a dog cha-cha dancing on Twitter for 10 seconds than spending months researching and editing an essay that’s commenting on popular culture.”

Still, the original supercutters occasionally return to the medium from time to time. Earlier this month, Juzwiak released a new supercut, drawn from TLC’s reality show Dr. Pimple Popper, in which the show’s host repeatedly compares patients’ discharge to food. Robson, meanwhile, is planning a Kickstarter for his next editing project, a massive video-game compilation that would essentially be a steroidal supercut, one inspired by Christian Marclay’s 24-hour film-montage The Clock.

And the form itself may be finding its way to a new generation. Saslaw notes that, in recent years, she’s seen a burst of quick, fan-created mini-montages—first on Vine, and then later Instagram. “It’s not quite cultural criticism,” she says, “but they’re so elaborate, because they’re all done on phones. They’re not super-elevated the way our supercuts were, but there is a supercutty component of it: It’s the same formula, and with the same appreciation, but with all of these crazy effects.”

There’s always a chance, of course, that the style of supercuts that took off a decade ago could return. “I have a huge archive of campy reality TV from 2005 to 2013,” says Juzwiak. And if the result steps on some toes? Well, you know why he’s here.

