Sunday, November 3, 2013

Healthcare.gov Likely Broken Until Key Thanksgiving Deadline


Healthcare.gov is going to be broken until the end of November, says Jeff Zients, a consultant brought in to fix the beleaguered federal health insurance e-commerce website. On a press conference call, Zients finally acknowledged widespread enrollment problems, estimating that only 3 in 10 users have been able to sign up and complete an application (a measly 700,000 total). "We're confident by the end of November, HealthCare.gov will be smooth for a vast majority of users," said Zients.

That estimate should scare the Obama administration: the end of November is dangerously close to the Thanksgiving deadline some experts say is crucial to snagging young, healthy consumers. "If it's not running by Thanksgiving, that's DEFCON 2," warned MIT economist Jonathan Gruber, who worked on both President Obama's and Gov. Mitt Romney's health care laws. "It's a real problem because people want to get insurance by January, but it's not a crisis." Health and Human Services has dispatched an army of field salesmen and celebrities to convince young "invincibles" to sign up for health insurance, which is needed to subsidize the costs of their elders. In Massachusetts, which has a similar individual mandate, enrollment surged toward the end of the deadline. But Massachusetts had twice as long to enroll people and no compressed enrollment window: under the Affordable Care Act, consumers must sign up by Dec. 15 - roughly three months after the website launched - to get insurance by Jan. 1.

Zients says QSSI, a government contractor that worked on the malfunctioning backend, will take over from the Centers for Medicare and Medicaid Services as the project's lead contractor. However, continuing the White House's trend of bizarre and abject secrecy, he would not reveal who is part of the "tech surge" to fix the website.

Bang With Friends Rebrands As ‘Down' To Match You With Friends Of Friends By Hotness


After settling a Zynga trademark infringement suit, Bang With Friends today rebrands as Down and reveals its revamped set of dating apps. Down's update lets you browse friends of friends rather than just friends, say you want to date someone as opposed to just being 'down' [to bang], and browse the hottest people in your network. And believe it or not, founder Colin Hodge says he wants Down to empower women. The updates for Bang With Friends' one million-plus users come to Down on Android today, iOS soon, and the web in the coming weeks. At its heart, Down is still an app for selecting people you think are sexy, hoping they choose you too, and then being connected over chat, similar to Tinder. But the new changes could make Down a vain curiosity for many and a daily habit for those on the prowl, rather than the rarely used utility Bang With Friends was.

A Kinder, Gentler Bang

"There's still a stigma for using tech for dating. It's rapidly eroding, but we still think it's crucial for our users to feel comfortable. That's part of the reason we rebranded," Hodge tells me. That sure minimizes the fact that Zynga owned the "With Friends" trademark from its hit game series, which includes Words With Friends. But being forced to change its name could be a blessing in disguise for Hodge's company. Launched in January 2013, Bang With Friends, with its brash name and doggystyle logo, immediately started turning people off. It blew off the subtlety and purported focus on finding you a soul mate that led previous generations of dating apps to have names like Match, eHarmony, and OKCupid. It was about finding you someone to fuck right now. It takes two to tango, though, and women didn't seem so keen on joining anything called Bang With Friends. At the peak of its hype in the spring of 2013, a way to see which of your friends had installed the supposedly anonymous app surfaced. 
I wasn't too surprised when I saw a ton of dudes and essentially zero women had signed up. But "Down"? That's an app whose name you could bring home to mother [wait, eww, no]. But seriously, the term 'down' is slang for being willing, and it's much more inviting. That means the app has a better chance of recruiting women, which might just make it a success. Hodge writes, "We chose DOWN to represent the simple, natural way that our generation dates, without alienating people who may not want an app that says 'bang' but are totally down otherwise."

Exponentially More Matches

Functionally, the biggest change from Bang With Friends to Down is that you can see friends of friends, not just your existing friends. This exponentially expands the pool of people you could be matched with. Friends of friends are less awkward to proposition than people you already know, yet the mutual connections provide a layer of trust that could convince people they won't get axe-murdered on a date set up through Down. Hodge explains, "One of the biggest requests was, 'I love using the app but I'm running out of people.'" If you have 1,000 friends with 1,000 friends each, you could now have a million potential mates, and the number keeps growing. "We're making it something you want to come into all the time." For the tamer among us, Down lets you say you're interested in dating someone, rather than only being allowed to request a hookup. You'll still be connected if one person wants to get down and the other wants to date, but your intentions will be made clear.

Mirror, Mirror On The Web…

What's most interesting may be Down's 'hotness' scores. Previously, you'd just be shown a random selection or alphabetical list you could click to 'bang' on the web, or swipe to accept or reject on mobile like Tinder. On Down, you get intelligent recommendations based on a matching algorithm. 
It takes into account a variety of characteristics, including mutual friend count, but also your hotness score, which is based on what percentage of people who see you dig into your profile or say they're interested. This means you'll be matched with people "in your league", whom you're more likely to both approve of and get approved by. Relevant matches = happy Down'ers. When you're viewing someone, you're also shown a list of their 10 friends with the highest hotness scores. This could lead to a sort of Wikipedia-chain browsing pattern, where you browse to Amy, get tempted by her friend Diana, only to end up saying you're 'down' with Emily. Oh, and you can see your own hotness score and how you stack up against friends. Even if you're taken or not into online dating, I bet a fair number of people will sign up for Down just to peek at how desirable they are. There's still no sign of monetization to make good on the rumored but unconfirmed million dollars in funding Down has received. You can imagine it eventually going with standard dating app premium features, though, like the ability to pay to appear in front of more potential lovers.

Down With A Purpose

"We want this to empower both males and females to be straightforward with their dating life, whether that means sex or a more traditional commitment, so we very much dispute any cries of sexism," Hodge tells me, though it's tough to tell if he's earnest or if this is the sweet talk of a player. Yes, Down is still inherently shallow. It's about looks first and foremost, which promotes a degree of objectification. It may still have trouble getting enough users of both sexes to make connections. It has to compete with the now-established Tinder plus its army of clones. And some people will always think it's gross. But from another perspective, maybe Hodge is right. We spend our lives beating around the bush when it comes to our sexual desires. 
Being shy, playing hard to get, denying our desires, and getting stuck in stale relationships. Maybe we deserve an app that lets us say how we really feel about someone without the fear of rejection.
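For the curious, the score-and-match mechanics described above can be sketched in a few lines. This is purely illustrative: Down has not published its algorithm, and every name, formula, and threshold here is my own assumption.

```python
def hotness(profile_views, engagements):
    """Hypothetical hotness score: the share of people who, after seeing
    your profile, dig into it or say they're interested."""
    if profile_views == 0:
        return 0.0
    return engagements / profile_views

def in_your_league(score_a, score_b, tolerance=0.15):
    """Hypothetical 'league' check: pair people whose scores sit within
    a tolerance band of each other, so both sides are likely to approve."""
    return abs(score_a - score_b) <= tolerance
```

Under this sketch, a user seen by 200 people who drew 50 engagements scores 0.25, and would be matched with candidates scoring roughly 0.10 to 0.40.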

Twitter Co-Founder Evan Williams' Blogging Platform Medium Opens Signups To All


Twitter co-founder Evan Williams has a new blogging platform called Medium, which has been a closed-signup affair since its introduction. Today, the platform sent out an email with the news that anyone can now sign up and start writing. There are a few requirements to sign up for and use Medium: you must be writing from the Chrome, Safari or Firefox browsers, and you must have a Twitter account to post. A verification link is sent via email; you click on it and then you're in. Posting is still not available from mobile devices.

Williams founded Odeo - the parent company of Twitter - with Noah Glass after selling his company Pyra Labs to Google. The main product of Pyra was Blogger, one of the early products that codified what we now know as 'blogs'. Now, Williams is in the blogging game again. Last month, Williams spoke to TC about his vision for Medium: "I think more people would be in a better place if more people shared their ideas," says Williams. Seen this way, Medium is just the next logical step in Williams' three-product cycle to inject better ideas into the world. Blogger helped open the doors for pajama bloggers to compete with the media moguls. A few years later, Twitter gave the power of broadcast distribution to everyone who had 140 characters to share. Now, to complete the circuit, Medium wants to make viral information more substantive - the hope in the Pandora's box of communication. "It's also an optimistic stance to say that we can build a system where good things can shine and get attention. And there's an audience for ideas and stories that appeal to more than just the most base desires of human beings."

I've had access to Medium for a while and haven't used it a whole lot. But I did love the overall writing experience, which is clean and quick. It feels easy to dash off a post based around an image or zip some text in and hit publish. 
It's definitely far lighter-weight than other options like WordPress, and has a lot in common with the publishing tools offered by Dustin Curtis' Svbtle network. Medium has managed to gain some relative popularity among a sea of other blogging alternatives, but not always for the best reasons. While there has been some interesting content, there have also been missteps like a false claim of government email snooping and Peter Shih's '10 things I hate about San Francisco' post. Posts like these have given Medium a rep as an incubator for self-unaware and inaccurate writing. Still, Williams addressed those issues fairly plainly in his chat with TC. "Please don't set this up as Evan thinks tech blogs are crap and therefore is fixing them with Medium," Williams told us. "People are going to publish crap on Medium."

Huge Google Shift Points To Faster Search Results


Researchers at USC have stumbled on a huge change in how Google architects its search services. The result? Reduced lag in serving search queries, especially in more far-flung regions (as in, far from Google's own data centres). The insight into Mountain View's pipes stems from other research the team was doing to develop a new method for tracking and mapping servers, identifying when they are in the same data center and estimating where that data center is. The method also identifies the relationships between servers and clients, and - as luck would have it - the team happened to be using it when Google made its big move (unless, of course, Mountain View makes such massive shifts regularly, which seems unlikely).

According to the findings, over the past 10 months Google has "dramatically" increased - by 600 percent, no less - the number of sites around the world from which it serves client search queries. (A visualization accompanying the study depicts the ramp-up, with black circles marking Google data centres and red triangles marking others' sites now being utilised by Google to relay search traffic.) The researchers note:

From October 2012 to late July 2013, the number of locations serving Google's search infrastructure increased from a little less than 200 to a little more than 1400, and the number of ISPs grew from just over 100 to more than 850.

The USC team says Google has made this change by repurposing existing infrastructure - utilizing client networks it was already relying upon to host content such as videos on YouTube, and reusing them to relay - and crucially speed up - user requests and responses for search and ads. "Google already delivered YouTube videos from within these client networks," said USC PhD student Matt Calder, lead author of the study, commenting in a statement. "But they've abruptly expanded the way they use the networks, turning their content-hosting infrastructure into a search infrastructure as well." 
Previously, search queries went directly to a Google data centre - a network structure that could introduce an element of lag, depending on how far from the data centre the query originated. The new architecture means searches go to a regional network first and are then relayed on to Google's data centre. While that might sound more long-winded, it actually has the opposite effect, thanks to the continuous connection between the regional node and Google's data centres, which keeps speeds up and helps mitigate the effect of lost data packets. The researchers explain:

Data connections typically need to "warm up" to get to their top speed – the continuous connection between the client network and the Google data center eliminates some of that warming up lag time. In addition, content is split up into tiny packets to be sent over the Internet – and some of the delay that you may experience is due to the occasional loss of some of those packets. By designating the client network as a middleman, lost packets can be spotted and replaced much more quickly.

Google's new search architecture resembles that of content delivery networks (CDNs) such as Akamai and Limelight Networks, which video services use to reduce lag when streaming content. How much lag is Google's new world order for search eliminating? Report author Ethan Katz-Bassett told TechCrunch that's difficult to assess at this point (the team is doing ongoing work to quantify the performance implications of the change), and said lag reduction will also necessarily vary "a lot" by region. But he described one example where search latency looks to have decreased by around a fifth. "To eyeball results from one machine in New Zealand, it used to get served from Sydney, and now it is directed to a frontend in NZ. As a result, it looks like the latency dropped by about 20%," he said. 
"The high-level implication is that many regions around the world that were previously somewhat underserved should receive faster performance," he added. "For example, of the networks we see using these new servers, 50% were 1600+km away from their old server on Google's network. Now, half of them are within 50km of their new server in the local ISP."

The new infrastructure looks to be a win not just for users (getting faster results) and for Google (delivering more ads), but also for ISPs, because it should lower their operational costs now that they are serving more traffic locally. And if Google is leaning more heavily on their infrastructure, it's possible Mountain View is paying them more, too. Rather than the shift being about Google future-proofing for expected global growth in search queries, Katz-Bassett's view is that this is about serving existing users around the world better. "On its own, it doesn't necessarily aid capacity, but is probably mainly useful for improving performance," he said when asked.

Why has Google made this change now? Again, hard to say (Google isn't commenting on the research). Katz-Bassett speculates that engineering and technical challenges prevented it from routing search traffic this way before (more likely that than a lack of business partnerships, at least, since the study notes that Google is 'mostly' utilising existing client networks, such as Time Warner Cable, for this new search topology). That, and prioritising this change over other performance improvements, said Katz-Bassett. "It does introduce some challenges: how should the system decide which server to direct a particular client to, to get the best performance? In the past, Google controlled the whole path as soon as a request hit a frontend. 
Now that most of the frontend locations are outside Google's network, the frontends have to relay it over the public Internet (towards Google data centers), so I imagine the conditions vary more (congestion, available bandwidth, etc), and it is a very large system to manage,” he added. The USC team presented their findings at the SIGCOMM Internet Measurement Conference in Spain yesterday.
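Why relaying through a nearby frontend beats a direct long-haul connection can be seen with a toy latency model of the "warm up" effect the researchers describe. This is an illustration of the idea only, not the study's methodology; all numbers and function names are invented.

```python
def slow_start_rtts(segments, initial_cwnd=1):
    """Round trips needed to deliver `segments` packets when the congestion
    window starts at `initial_cwnd` and doubles every round trip."""
    rtts, cwnd, sent = 0, initial_cwnd, 0
    while sent < segments:
        sent += cwnd
        cwnd *= 2
        rtts += 1
    return rtts

def direct_ms(client_dc_rtt, segments):
    """Cold connection straight to the data centre: one handshake round
    trip, then slow start over the full client-to-data-centre path."""
    return client_dc_rtt + slow_start_rtts(segments) * client_dc_rtt

def relayed_ms(client_edge_rtt, edge_dc_rtt, segments):
    """Cold connection only on the short client-to-frontend hop; the
    frontend's long-haul link to the data centre is already warm, so the
    long leg costs a single round trip."""
    cold_hop = client_edge_rtt + slow_start_rtts(segments) * client_edge_rtt
    return cold_hop + edge_dc_rtt

# A hypothetical far-flung client: 200 ms to the data centre directly,
# 20 ms to a local ISP frontend sitting 180 ms from the data centre.
print(direct_ms(200, 8))       # 1000 ms
print(relayed_ms(20, 180, 8))  # 280 ms
```

With these made-up numbers the relayed path comes out more than three times faster even though the request covers the same distance; real gains, the researchers stress, vary a lot by region.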

A Way To Save BlackBerry


The first smartphone I owned was a Nokia Communicator, which I chose because the C++ dev kit gave me the most freedom. When the iPhone appeared I did not switch, because mandatory App Store signing to execute code seemed like a major step in the war on general computation. Eventually, acting upon a moral imperative, I rid myself of Nokia and got an Android.

Many hackers adhere to the ideology of Richard Stallman. We believe that the use of free software (that is, software whose source can be viewed, altered and distributed by all of its users) is morally advantageous. We subject ourselves to "inferior" platforms in exchange for more liberty. Android is not free software, but it has many free software components, so it is the most free option for the time being. Stallman himself refuses to carry a cellphone because none of them are free software and they have government back doors. His ascetic devotion to our cause is noble, but not realistic for those of us who find mobile devices irreplaceable tools for improving our incomes and sex lives. We less devout followers of the Church of Emacs settle for the most free platform in lieu of true freedom.

I hate Android's UI/UX cesspool and Google's growing surveillance state. There are vast numbers of us sharing that sentiment, and all of us could be happy BlackBerry users. BlackBerry would merely have to perform a single revolutionary act: the liberation of mobile users and developers. Release every line of source under the GPLv3. Open up the specs for every hardware component and let the community build their own devices without NSA or corporate backdoors. Our gratitude and fealty will show themselves in BlackBerry's quarterly earnings reports. BlackBerry is a flickering candle about to be snuffed, but hope yet lies in the baptismal flame of liberty. 
With nothing left to lose, perhaps BlackBerry will have the courage to disrupt its competitors and world governments.

Pinterest Closes Another Copyright Hole, Inks A Deal With Getty Images, Will Pay A Fee For Metadata


With a fresh $225 million in its pocket, Pinterest is gearing up to spend a little of it to build out its platform and the data that powers it - and close up a copyright hole in the process. Pinterest today announced a deal with Getty Images - the image agency that holds digital rights to some 80 million still images and illustrations and over 50,000 hours of stock film footage. Getty will provide Pinterest with metadata, and in exchange, Pinterest will pay Getty a fee. Metadata will start to get added in the coming months, the companies say. Financial terms of the deal are not being disclosed, but offering a fee for image metadata is a first for Pinterest. Up to now, Pinterest has offered traffic to partners for more data - such as in the case of the recent article pins it introduced, or the Flickr deal from last year that added a Pin-it button to Flickr.com and Flickr backlinks for images posted on Pinterest. “As part of our agreement, we'll pay Getty Images a fee for the data they share and will help make sure that their images get proper attribution,” Pinterest notes today. “We're just getting started with Getty Images but we're excited about the possibilities of what their data can help us deliver.” Shareaholic recently noted that Pinterest is the second-biggest referrer of traffic on the Internet after Facebook, so for consumer sites based around advertising and (hence) traffic, this makes sense. But in the case of Getty, traffic is less important than the data it is able to provide about the images that it holds. And of course it has photographers and illustrators that want to be compensated for their images getting used. Getty says that it will be sharing the fee with its contributors (that is, those photographers and illustrators) - which means that this deal closes up another awkward copyright hole for Pinterest. As Getty Images co-founder and CEO Jonathan Klein laid out for us last year, this was something that the company was gearing up to address. 
"We're comfortable with people using our images to build traffic," he said. "The point in time when they have a business model, they have to have some sort of license." Pinterest, which is now starting to court more advertising and really focus on monetization, definitely fits into that category of now having a business model.

Pinterest will be able to use this data to give its users more detail about what they're looking at, including photographers' names and what's in the picture. But it sounds like Pinterest will also go much deeper with that data. Take a pinned Getty picture of scallops with brussels sprouts: Pinterest will be able to use the metadata to suggest more pins for recipes using those ingredients, or perhaps more images from the photographer, Thomas Barwick. That will make it more likely that a user will spend more time on Pinterest looking at more content. And creating a stronger web of linked pins will also help Pinterest monetize that content better - more tags to match against relevant ads, and more of the all-important "engagement" that has become such an important metric for social media sites. While Pinterest these days sometimes leads people to "dead ends" when there is not enough information about a picture that has been shared - a fetching handbag, yes, but who is the designer? - this potentially will open up more avenues for users to travel further, so to speak (at least where Getty pictures are concerned).

Getty says it will be providing two pieces of technology for this service. The first is its PicScout image-recognition technology, which will crawl Pinterest to identify Getty's images. It will then link those images to Getty's metadata using its Connect API. "We'll get a photo credit for our images on Pinterest's site and a link back. Pinterest users get more context and have more fun," Getty noted today. 
The service will go live first with Getty Images house content and will later expand to iStock content and other Getty collections.
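As a sketch of how that metadata could power suggestions: once PicScout has matched a pin to its Getty record, related pins can be ranked by keyword overlap. This is purely illustrative; the function, the keyword-overlap approach, and the sample data are my assumptions, not Pinterest's or Getty's actual systems.

```python
def related_pins(pin_keywords, catalog):
    """Rank other pins by how many metadata keywords they share with
    the pin being viewed. `catalog` maps pin id -> set of keywords."""
    scored = [
        (len(pin_keywords & keywords), pin_id)
        for pin_id, keywords in catalog.items()
        if pin_keywords & keywords  # keep only pins with some overlap
    ]
    return [pin_id for _, pin_id in sorted(scored, reverse=True)]

# Hypothetical catalog built from image-recognition matches.
catalog = {
    "salmon_pin":  {"salmon", "recipe"},
    "handbag_pin": {"handbag", "leather"},
    "scallop_pin": {"scallops", "brussels sprouts", "recipe"},
}
print(related_pins({"scallops", "recipe"}, catalog))
# ['scallop_pin', 'salmon_pin']
```

The pin sharing the most keywords ranks first, and pins with nothing in common are dropped, which is one way the "dead end" problem could be avoided.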

Ask A VC: Google Ventures' Dr. Krishna Yeshwant On The Opportunities For Health-Focused Mobile Apps And More


In this week's Ask A VC show, we sat down with Google Ventures partner Dr. Krishna Yeshwant in the TechCrunch TV studio. Yeshwant is unusual among the VCs we have on the show: not only is he an investor, programmer and former entrepreneur, but he is also a practicing physician. Yeshwant, who is based in Boston, helped lead the firm's investments in a number of health companies, including Flatiron Health, Foundation Medicine and One Medical Group. We asked Yeshwant what the major opportunities are in mobile health and diagnosis. He also commented on the new FDA guidelines for mobile medical apps.