A selection of 8 links for you. Use them wisely. I’m charlesarthur on Twitter. Observations and links welcome.
Michael Corkery and Jessica Silver-Greenberg:
The thermometer showed a 103.5-degree fever, and her 10-year-old’s asthma was flaring up. Mary Bolender, who lives in Las Vegas, needed to get her daughter to an emergency room, but her 2005 Chrysler van would not start.
The cause was not a mechanical problem — it was her lender.
Ms. Bolender was three days behind on her monthly car payment. Her lender, C.A.G. Acceptance of Mesa, Ariz., remotely activated a device in her car’s dashboard that prevented her car from starting. Before she could get back on the road, she had to pay more than $389, money she did not have that morning in March.
“I felt absolutely helpless,” said Ms. Bolender, a single mother who stopped working to care for her daughter.
At present, this story has 983 comments. People feel strongly about this topic.
Christian de Looper:
The car is an Audi SQ5 outfitted with Delphi’s tech, and has been tested on shorter drives in California and Nevada. Delphi believes the drive across the country will help it gain more insight, and it expects to collect a total of 2.3 terabytes of data during the trip.
News surrounding autonomous cars seems to be making headlines every day. Tesla recently announced that the next update to the Tesla Model S will allow the car to drive itself, despite the fact that it is still unclear whether this type of technology is legal.
The Delphi car’s “brain” was developed in partnership with Ottomatika, which takes the data from the sensors during test drives and creates a virtual environment for the car, which it uses to apply driving behaviors.
The trip itself will take eight days, and the car will not drive for more than eight hours per day. This will allow the car to complete the trip in daylight, stick to the speed limit, and keep the human passengers, who will make sure that everything runs smoothly, comfortable.
It’s important to note the car will only operate autonomously on highways, with human drivers taking the wheel once the car gets into a city.
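A quick back-of-envelope check on those figures – 2.3 terabytes over eight days of at most eight hours’ driving per day, assuming data is only collected while the car is moving:

```python
# Back-of-envelope: average sensor data rate implied by Delphi's
# figures (2.3TB total, 8 days, at most 8 hours of driving per day).
# Assumes data is collected only while driving.
total_bytes = 2.3e12
driving_seconds = 8 * 8 * 3600  # 8 days x 8 hours x 3600 s
rate_mb_per_s = total_bytes / driving_seconds / 1e6
print(f"{rate_mb_per_s:.1f} MB/s")  # roughly 10 MB/s
```

That’s on the order of 10MB per second of sensor data – substantial, but well within what an on-board array of lidar, radar and cameras can plausibly generate.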
It’s not Google’s software; from another article:
The software that interprets the data drawn from those systems and the algorithms that help the car make driving decisions were developed jointly by Delphi and Ottomatika – a company started by Carnegie Mellon University.
The frustrating thing is that Delphi’s own site, which is meant to follow the trip – delphidrive.com – doesn’t have any useful information.
Harry McCracken tried it before it set off on its possibly unlicensed jaunt:
I’ve already spent enough time being driven around by autonomous vehicles (always with a human behind the wheel just in case) that at least some of the novelty has worn off. The fact that Delphi’s car drove itself pretty much like a human would have—stopping at safe distances at stop lights, switching lanes when necessary, and not doing anything which felt particularly robotic—didn’t startle me. But I was surprised by how normal the vehicle looked.
Unlike the Google car I’d ridden in, there was no giant spinning lidar sensor atop the vehicle to tip off other motorists that this particular Audi SUV was anything unusual. It was well equipped with lidar, radar, and cameras, but they were unobtrusive—some of the gadgetry was even concealed behind the bumpers and license plate. The data collected by those sensors was displayed on the ordinary in-dash infotainment system rather than on specially rigged-up LCD screens. And the tech didn’t take up an out-of-the-ordinary amount of space, which I didn’t realize until after the trip was over and we popped the trunk, which was empty…
…Delphi isn’t working on self-driving as an exercise in futurism. It’s doing it because the car companies of the world are going to expect it to have competence in this field over the next few years. Delphi will need to be able to supply the necessary components, at a price and level of integration which makes sense for production vehicles.
There would be a strange irony if Google were to get outpaced in self-driving cars by all the other manufacturers.
Google stopped selling the first version of Glass and shut its Explorer program in January, moving the project out of its Google X research lab into a standalone unit. Ivy Ross remained head of the Glass team but Tony Fadell, head of Google’s Nest connected home division, now oversees strategy for the project.
The changes sparked speculation that Google will abandon Glass. However, Schmidt told The Wall Street Journal that it has been put under Fadell’s watch “to make it ready for users.”
“It is a big and very fundamental platform for Google,” Schmidt said. “We ended the Explorer program and the press conflated this into us canceling the whole project, which isn’t true. Google is about taking risks and there’s nothing about adjusting Glass that suggests we’re ending it.”
He said Glass, like Google’s self-driving car, is a long-term project. “That’s like saying the self-driving car is a disappointment because it’s not driving me around now,” he said. “These things take time.”
Which users, though? Consumer users? I don’t see it. Glass didn’t get consumer approval; instead it met direct and continued rejection. Industrial users, sure. There’s a use case there. But Google will quickly find itself competing with rivals – as the above link shows for self-driving cars.
Andrew Hoyle tried it out, and it’s the camera and battery that draw most of his complaints. (For the rest, it’s a phone like many other metal-cased phones.) I noted this:
We don’t miss the M8’s duo-lens, which is no longer seen on the back of the M9. This extra sensor was designed to create unusual images with 3D effects. Sure, they were a bit of fun, but they were definitely a novelty and one that quickly wore off. We do miss a few other things, though. Despite running the latest version of Android, it doesn’t incorporate all the new camera features, most notably raw support. It could also really use optical image stabilisation (OIS), which helps physically smooth bumpy shots; not only does OIS help at slow shutter speeds, but when you’re steadier there are fewer low-light artifacts (noise processing exacerbates the effect of camera shake).
The video looks acceptable, though you’ll really notice the jitter in bright light, when it chooses a fast shutter speed. Without image stabilisation, the combination makes the rolling shutter (that ugly wobble) look even worse. In low light, it suffers from the same lack of tonal range that’s in the photos.
HTC suggested last year that the duo-lens made sliced bread look a bit déclassé. Now it’s dropped it. Ditto Samsung, which has removed tons of features from the S6 compared with the S5. If you’re so sure a hardware feature matters for your flagship, why drop it after a year?
Google Fiber will sell ads in Kansas City tied to TV viewing habits » The Kansas City Star
your neighbor might see a different commercial than you while watching the same basketball game. And your kids, watching that game in another room, might see yet a different spot.
That super-narrow targeting represents something nearing a holy grail for television advertisers, even as it raises privacy issues about a company selling TV service tracking what its customers watch.
On a post to its online product forum on Friday, Google Fiber said the targeting “allows you to see ads for nearby businesses — like the car dealership downtown or the neighborhood flower shop.” It says it will start “a small trial” in early April. Kansas City will be the first market where the technology will be deployed — by Google or any cable company.
The practice won’t mean Google Fiber customers will see any more ads. Rather, like most cable companies, it will sell targeted spots replacing some national advertising.
Customers who don’t want those targeted ads, the company says, can change the settings on their TV boxes to opt out. But those who do nothing will see ads aimed at them based on their viewing behaviour…
…[Roger] Entner [who monitors the TV industry for Recon Analytics] speculated that the targeted ads might ultimately draw attention from federal regulators over privacy concerns. Think of someone who has friends over to watch TV. The targeted ads that appear during a show might give visitors insight to what that person watches when no one else is around.
“It can very quickly get to that creepy part of the equation,” he said.
The vulnerability was uncovered by Brandon Potter and JB Snyder, technical security consultant and founder, respectively, at security consulting and testing firm Bancsec. The two found that once they’d logged into a Hilton Honors account, they could hijack any other account just by knowing its account number. All it took was a small change to the site’s HTML content, followed by a reload of the page.
After that, they could see and do everything available to the legitimate holder of that account, such as changing the account password; viewing past and upcoming travel; redeeming Hilton Honors points for travel or hotel reservations worldwide; or having the points sent as cash to prepaid credit cards or transferred to other Hilton Honors accounts. The vulnerability also exposed the customer’s email address, physical address and the last four digits of any credit card on file.
Terrible, terrible testing.
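From the description, this sounds like a classic insecure direct object reference: the server trusted an account number supplied by the client instead of checking it against the logged-in session. A minimal sketch of the bug class – all names and data here are hypothetical, not Hilton’s actual code:

```python
# Hypothetical illustration of an insecure direct object reference
# (IDOR). The account numbers and owners below are made up.
ACCOUNTS = {
    "111111111": {"owner": "alice", "points": 52000},
    "222222222": {"owner": "bob", "points": 4000},
}

def get_account_vulnerable(session_user, account_number):
    # Vulnerable: any logged-in user can fetch any account just by
    # editing the account number in the page and reloading it.
    return ACCOUNTS.get(account_number)

def get_account_fixed(session_user, account_number):
    # Fixed: the server verifies that the requested account belongs
    # to the authenticated session before returning it.
    account = ACCOUNTS.get(account_number)
    if account is None or account["owner"] != session_user:
        raise PermissionError("account does not belong to this session")
    return account
```

With the vulnerable lookup, a session logged in as “bob” can read and act on alice’s account simply by swapping in her account number; the fixed version refuses. This is exactly the kind of flaw that routine authorization testing should catch.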
Rachel Metz has previously tried Magic Leap’s AR system; now she’s trying Microsoft’s HoloLens in its prototype stage:
I was not blown away by what I saw in Redmond. The holograms looked great in a couple of instances, such as when I peered at the underside of a rock on a reconstruction of the surface of Mars, created with data from the Curiosity rover. More often, though, images appeared distractingly transparent and not nearly as crisp as the creatures Magic Leap showed me some months before. What’s more, the relatively narrow viewing area in front of my face meant the 3-D imagery seen through HoloLens was often interrupted by glimpses of the unenhanced world on the periphery. The headset also wasn’t closed off to the world around me, so I still had my natural peripheral vision of the unenhanced room. This was okay when looking at smaller or farther-away 3-D images, like an underwater scene I was shown during my first demo, or while moving around to inspect images close-up from different angles. The illusion got screwed up, though, when it came to looking at something larger than my field of view.
Microsoft is also still working on packing everything into the HoloLens form it has promised. Unlike the untethered headset that the company demonstrated in January, the device I tried was unwieldy and unfinished: it had see-through lenses attached to a heavy mass of electronics and plastic straps, tethered to a softly whirring rectangular box (Microsoft’s holographic processing unit) that I had to wear around my neck and to a nearby computer.