If so, my douchey friend from across the pond, you are a shill, and you can drive to my mother's house every 2 days and reset her unit so she can watch movies. I bought it for her because it was supposed to be simple, and I relied on the WD name.
However, yesterday afternoon while I was at work my wife called and told me that the Instant Queue was no longer available. I told her to use the Wii instead and it worked fine. I performed the reset procedure I found here and got it working again.
I'm having this issue as well now. I wanted to know if everyone here has Blu-ray added to their Netflix accounts. I just added Blu-ray, and now I'm having this issue. Looking forward to the firmware update. I have no other complaints except the long startup time when the WDTV starts up. It's much longer than on the normal WDTV Live.
Another thing for those of you who are having problems reaching the deactivate option within WDTV: make sure you start pushing the arrow sequence once the Netflix window opens up and starts to load the queue. You should see the circular progress indicator when you push the buttons. If it reaches the queue error screen, you were too slow.
In 1999, Netflix had 2,600 DVDs to choose from but intended to grow its library to 100,000 titles. To make it easier for members to find movies, Netflix developed a personalized merchandising system. Initially, it focused on DVDs, but in 2007, Netflix launched streaming, which used the same personalization system.
Search. There was little investment in search in the early days as Netflix assumed members searched for expensive new release DVDs. But the team discovered that the titles members chose included lots of older, less expensive, long-tail titles, so they ramped up investment in search.
The high-level engagement metric was retention. However, it takes years to move this metric, so Netflix used a more sensitive, short-term proxy: the percentage of members who rated at least 50 movies during their first two months with the service.
The theory: members would rate lots of movies to get better recommendations. Many ratings from a member signaled they appreciated the personalized recommendations they received in return for their ratings.
It took Netflix more than a decade to demonstrate that a personalized experience improved retention. But consistent growth in this proxy convinced the company to keep doubling down on personalization.
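To make that proxy concrete, here is a minimal sketch of how such a metric could be computed. The member IDs, dates, and data shapes are hypothetical; this is not Netflix's actual pipeline.

```python
# Hypothetical sketch of the proxy metric: the share of members who rated
# at least 50 titles within their first two months. All data below is made up.
from datetime import date, timedelta

signup_dates = {"m1": date(2004, 1, 10), "m2": date(2004, 2, 1)}  # member -> signup date
rating_events = [                                                  # (member, date of rating)
    *[("m1", date(2004, 1, 15))] * 60,   # m1 rated 60 titles in their first week
    *[("m2", date(2004, 5, 1))] * 80,    # m2 rated 80 titles, but months after joining
]

TWO_MONTHS = timedelta(days=60)
early_rating_counts = {member: 0 for member in signup_dates}
for member, rated_on in rating_events:
    if rated_on - signup_dates[member] <= TWO_MONTHS:
        early_rating_counts[member] += 1

proxy = sum(count >= 50 for count in early_rating_counts.values()) / len(signup_dates)
print(f"{proxy:.0%} of members rated 50+ titles in their first two months")  # 50%
```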
Over time, Netflix got better at suggesting similar titles for members to add to their queue, which drove this source from ten to fifteen percent of total queue adds. The QUACL was a great environment for testing algorithms; in fact, Netflix executed its first machine learning tests within the QUACL.
Gib's note: Welcome to the 200 new members who joined since my last essay! After five months, we're 5,400 strong. In each essay, I draw from my experience as both VP of Product at Netflix and Chief Product Officer at Chegg to help product leaders build their careers. This is essay #50.
Netflix began as a DVD-by-mail startup, following the invention of the DVD player in 1996. In 1998, Netflix launched its website with fewer than 1,000 DVDs. Here's what the site looked like in its first few years.
Over twenty years, Netflix has gone from members choosing 2% of the movies the merchandising system suggests to 80% today. The system also saves members' time. In the early days, a member would explore hundreds of titles before finding something they liked. Today, most members look at forty choices before they hit the "play" button. Twenty years from now, Netflix hopes to play that one choice that's "just right" with no browsing required.
Below, I detail Netflix's progress from the launch of Cinematch in 2000 to 2006. It's a messy journey, with an evolving personalization strategy propelled by Netflix's ability to execute high-cadence experiments using its home-grown A/B test system.
Netflix introduced a personalized movie recommendation system, using member ratings to predict how much a member would like a movie. The algorithm, called Cinematch, was a collaborative filter.
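To make the idea concrete, here is a minimal item-item collaborative filtering sketch in Python. It illustrates the general technique only; it is not Netflix's Cinematch code, and the titles, ratings, and cosine-similarity choice are assumptions made for the example.

```python
# Minimal item-item collaborative filtering sketch (illustrative only; not
# Netflix's actual Cinematch implementation). Rows are members, columns are
# titles, and the 1-5 values mirror the star ratings described above.
import numpy as np

titles = ["Friends", "Seinfeld", "The Matrix", "Alien"]
ratings = np.array([          # 0 means "not yet rated"
    [5, 4, 0, 1],
    [4, 5, 2, 0],
    [1, 0, 5, 4],
    [0, 2, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity computed over members who rated both titles."""
    both = (a > 0) & (b > 0)
    if not both.any():
        return 0.0
    a, b = a[both], b[both]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

n = len(titles)
sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j]) for j in range(n)]
                for i in range(n)])   # item-item similarity matrix

def predict_rating(member, item):
    """Predict a member's star rating as a similarity-weighted average of
    the ratings they already gave to other titles."""
    rated = [j for j in range(n) if j != item and ratings[member, j] > 0]
    weights = np.array([sim[item, j] for j in rated])
    if weights.sum() == 0:
        return 0.0
    return float(weights @ ratings[member, rated] / weights.sum())

# Member 0 hasn't rated "The Matrix"; predict how many stars they'd give it.
print(round(predict_rating(0, titles.index("The Matrix")), 2))   # ~3.0
```

The weighted average here stands in for whatever neighborhood model Cinematch actually used; the point is that a handful of a member's own ratings, pooled with everyone else's, is enough to score an unseen title.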
Netflix created a five-star rating system and eventually collected billions of ratings from its members. Netflix experimented with multiple "star bars," at times stacking the stars to indicate expected rating, average rating, and friends' rating. (It was a mess.)
Dynamic store. This algorithm indicated whether a DVD was available to merchandise. Late in the DVD era, it also determined whether a DVD was available in a member's local hub. (By 2008, Netflix only merchandised titles that were available locally, to increase the likelihood of next-day DVD delivery.)
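As a rough illustration of that gate (not the actual dynamic store implementation), a merchandising candidate list might be filtered against local hub inventory like this; the titles, inventory counts, and function name are hypothetical:

```python
# Hypothetical sketch of the "dynamic store" gate: only titles with discs in
# the member's local hub remain eligible to be merchandised.
local_hub_inventory = {"The Matrix": 3, "Alien": 0, "Seinfeld": 12}

def merchandisable(candidates, inventory):
    """Keep only titles the local hub can ship for next-day delivery."""
    return [title for title in candidates if inventory.get(title, 0) > 0]

print(merchandisable(["The Matrix", "Alien", "Seinfeld"], local_hub_inventory))
# ['The Matrix', 'Seinfeld']  -- "Alien" is out of stock locally, so it's hidden
```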
Recognizing that multiple family members used a shared account, Netflix launched "Profiles." This feature enabled each family member to create their own movie list. It was a highly requested feature, but only two percent of members used it despite aggressive promotion. It was a lot of work to manage an ordered list of DVDs, and only one person in each household was willing to do it.
Given the low adoption, Netflix announced its plan to kill the feature but capitulated in the face of member backlash. A small set of users cared deeply about the feature; they were afraid that losing Profiles would ruin their marriages! As an example of "all members are not created equal," half the Netflix board used the feature.
The hypothesis: if you create a network of friends within Netflix, they'll suggest great movie ideas to each other and won't quit the service because they don't want to leave their friends. At launch, 2% of Netflix members connected with at least one friend, but this metric never moved beyond 5%.
Create algorithms and presentation-layer tactics to connect members with movies they'll love. Use explicit and implicit taste data, along with lots of data about movies and TV shows (ratings, genres, synopses, lead actors, directors, etc.), to create algorithms that connect members with titles. Also, create a UI that provides visual support for personalized choices.
Why the 2011 dip in the metric? By this time, most members streamed movies, and Netflix had a strong implicit signal about member taste. Once you hit the "Play" button, you either kept watching or stopped. Netflix no longer needed to collect as many star ratings.
The original petri dish for personalization was an area on the site with a "Recommendations" tab. But testing revealed that members preferred less prescriptive language. The replacement, a "Movies You'll Heart" tab, generated lots more clicks. The design team thought the tab was "fugly," but it worked.
Members "binge-rated" while they waited for their DVDs to arrive. The Ratings Wizard was critical in moving the "percentage of members who rate at least 50 movies in their first two months" proxy metric.
Netflix collected age and gender data from its members, but when the team used demographics to inform predictions about a member's movie taste, predictive power did not improve. Huh?
How did Netflix measure predictive power? The proxy metric for the personalization algorithms was RMSE (Root Mean Squared Error), a calculation that measures the delta between the algorithm's predicted rating and a member's actual rating. If Netflix predicted you would rate "Friends" and "Seinfeld" four and five stars respectively, and you rated these shows four and five stars, the prediction was perfect. RMSE is a "down and to the right" metric that improved over time, mainly through improvements in the collaborative filtering algorithm.
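As a worked illustration, here is the RMSE calculation in a few lines of Python, using the "Friends"/"Seinfeld" example above plus one made-up imperfect prediction:

```python
# Worked RMSE example (numbers beyond the Friends/Seinfeld case are made up).
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and actual star ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Predictions match the member's ratings exactly: a perfect prediction.
print(rmse([4, 5], [4, 5]))   # 0.0

# A hypothetical miss: the member actually rated both shows 3 stars.
print(rmse([4, 5], [3, 3]))   # ~1.58 -- weaker predictive power
```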
Unfortunately, age and gender data did not improve predictions; it did not improve RMSE. Movie tastes are hard to predict because they are so idiosyncratic. Knowing my age and gender doesn't help predict my movie tastes. It's much more helpful to know just a few movies or TV shows I like.
To see this insight in action today, create a new profile on your Netflix account. Netflix asks you for a few titles you like to kickstart the personalization system. That's all they need to seed the system.
The QUACL is the Queue Add Confirmation Layer. Once a member added a title to their Queue, a confirmation layer would pop up suggesting similar titles. For example, when a member added "Eiken" to their movie list, the collaborative filtering algorithm suggested six similar titles.
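Here is a hedged sketch of that suggestion step: once a title is added, the confirmation layer looks up its nearest neighbors in a precomputed item-item similarity table and shows the top six. The neighbor titles and similarity scores below are invented for illustration; only "Eiken" comes from the example above.

```python
# Hypothetical sketch of the QUACL's similar-title lookup. The neighbor
# titles and scores are invented; this is not Netflix's actual data or code.
similar_titles = {
    "Eiken": [("Title A", 0.91), ("Title B", 0.88), ("Title C", 0.85),
              ("Title D", 0.82), ("Title E", 0.80), ("Title F", 0.78),
              ("Title G", 0.74)],
}

def quacl_suggestions(added_title, k=6):
    """Return the k most similar titles to show in the confirmation layer."""
    neighbors = similar_titles.get(added_title, [])
    return [title for title, _ in sorted(neighbors, key=lambda pair: -pair[1])[:k]]

print(quacl_suggestions("Eiken"))   # the six closest neighbors
```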