Dear friends of the Singularity Institute,
It's been just over one year since I took the reins at the Singularity Institute. Looking back, I must say I'm proud of what we accomplished in the last year.
Consider the "top priorities for 2011-2012" from our August 2011 strategic plan. The first priority was "public-facing research on creating a positive singularity." On this front, we did so well that SI had more peer-reviewed publications in 2012 than in all past years combined. (Strictly speaking, a few publications scheduled for 2012 have slipped to 2013, but preprints of those are already available on our research page.)
Our second priority was "outreach / education / fundraising." Outreach and education were achieved mostly through the Singularity Summit and through the new Center for Applied Rationality, which was spun out of the Singularity Institute and is now its own 501c3 organization running entirely on its own funding. As for fundraising: 2012 was our most successful year yet.
Our third priority was "improved organizational effectiveness." Here, we grew by leaps and bounds in 2012. Throughout the year, we built our first comprehensive donor database (to improve donor relations), launched a regular newsletter (to improve public communication), instituted best practices in management and accounting throughout the organization, began tracking costs and predicted benefits for all major projects, started renting a new office in Berkeley that now bustles with activity every day, updated the design and content of our website, gained $40,000/mo of free Google Adwords directing traffic to SI web properties, and more.
Our fourth priority was to run our annual Singularity Summit. We were pleased not only to run our most professional Summit yet, but also to subsequently sell the Summit to Singularity University (SU). We are confident that the Summit is in good hands, and we are also pleased that SU's acquisition of the Singularity Summit provides us with some much-needed funding to expand our research program.
That said, most of the money from the Summit acquisition is being dedicated to a special fund for Friendly AI researchers, and does not support our daily operations. For that, we need your help! Please contribute to our ongoing matching challenge, which ends January 20th!
Onward and upward,
Winter Matching is 80% Complete!
Our Winter Matching Challenge ends on January 20th, 2013. We've raised $93,000 of our $115,000 target, so we're 80% of the way there! Remember that every donation to the Singularity Institute made before Jan. 20th will be matched dollar-for-dollar, up to a total of $115,000!
Now is your chance to double your impact while helping us raise up to $230,000 to help fund our research program.
Please read our blog post about the challenge for more details, including our accomplishments over the last year and our plans for the next 6 months. Donate now.
Anna Salamon on Philosophy Talk
Former Singularity Institute researcher Anna Salamon recently appeared on the popular philosophy podcast Philosophy Talk, where she spoke with podcast hosts (and Stanford philosophers) John Perry and Ken Taylor about the likely impacts of advanced artificial intelligence.
An MP3 of the program can be found here. Below is a summary from the program page:
"The rapid advance of computer technology in recent decades has produced a vast array of intelligent machines that far outstrip the human mind in speed and capacity. Yet these machines know far less than we do about almost everything. Is it possible to have the best of both worlds? Can we use new technologies to create a hybrid intelligence that seamlessly integrates the vast knowledge and skills embedded in our biological brains with the vastly greater capacity, speed, and knowledge-sharing ability of our mechanical creations? John and Ken examine the prospects for transcending the biological limits of the human mind with Anna Salamon from the Singularity Institute for Artificial Intelligence. This program was recorded live at the Marsh Theater in Berkeley, California."
Anna Salamon worked as a research fellow at the Singularity Institute for Artificial Intelligence before leaving to become the Executive Director of the Center for Applied Rationality, an organization dedicated to fostering a community of aspiring rationalists through the teaching of advanced decision making. We are thrilled to see her spreading such rationality skills to the world!
An Appreciation of Michael Vassar
Michael Vassar resigned his position as CEO of the Singularity Institute (to join Panacea Research) more than one year ago, so a letter of appreciation for his contributions to SI is long overdue!
The delay does, however, allow me to thank him for one of his most publicly visible contributions, which was only completed recently. During his tenure with SI, Michael built the annual Singularity Summit into such a valuable cultural asset that Singularity University approached us in late 2011 to acquire it, a deal that closed one year later in December 2012. The deal provides SI with a sizable chunk of funding for our growing research program, and gives SU a key asset in the singularity brand space. What Michael achieved here can be thought of as the non-profit equivalent of a successful exit.
Even before this, Michael managed to nearly double SI's budget (from 2009-2011), helped get us featured in many top media outlets, and built relationships with several leading academics, e.g. philosopher David Chalmers (whose two papers on the singularity introduced the topic to mainstream philosophers), psychologist Gary Marcus (who now writes in mainstream outlets about AI safety), and physicist Max Tegmark (whose forthcoming book has a section on the singularity).
Still, Michael's most significant contribution may be the creation of a street-level movement where before there were mostly just internet discussions. In 2006-2007, Michael began to build a community in New York City, which has since grown into the largest hub of Less Wrong / Singularity Institute activity outside the Bay Area. Many early members of this community went on to become SI staff members (Carl Shulman, Amy Willey, Jason Murray), major SI supporters, or community leaders. After relocating to the Bay Area, Michael worked with Anna Salamon and Carl Shulman to launch SI's Visiting Fellows program, which essentially created the street-level Less Wrong / Singularity Institute community in the Bay Area — a community that has produced its own set of SI staff members, supporters, and community leaders.
Michael remains an active SI Board member and supporter.
Thanks, Michael, for your dedicated work on behalf of the Singularity Institute! We wish you the best of luck with Panacea Research.
An Appreciation of Amy Willey
Toward the end of 2012, SI's Chief Operating Officer and Singularity Summit organizer Amy Willey got married and moved to Michigan, where her husband's company (Stik.com) is now located. Due to her change of location and the sale of the Singularity Summit, Amy will no longer be working with SI in 2013.
Amy organized the Singularity Summit in 2010, 2011, and 2012, and is thus (in addition to Michael Vassar) the other person chiefly responsible for building the Summit into the top-notch event it is today.
As our Chief Operations Officer for several years, Amy was a dependable and trustworthy member of our team, and she played a significant role in growing SI into a mature organization that "has its act together," follows organizational best practices, and so on.
Thanks, Amy, for your dedicated work for the Singularity Institute! We wish you the best of luck on your future adventures.
Luke Muehlhauser Speaking at Stanford in February
Luke will be speaking at the Leonardo Art/Science Evening Rendezvous at Stanford on February 6, 2013.
Here is the abstract of the talk:
Superhuman Artificial Intelligence: Promise and Peril
Technological revolutions shape our world more than anything else, and superhuman AI will be the most transformative technological revolution of all. But will this revolution be positive or negative? Will it be more like modern medicine or the atom bomb? Several considerations suggest that superhuman AI will, by default, have negative effects on humanity. To ensure that superhuman AI impacts us positively, we should invest in AI safety research today, so that AI safety research outpaces AI capabilities research.
Doug Wolens' Documentary "The Singularity" Now Available on iTunes
Filmmaker Doug Wolens has been working on his documentary "The Singularity" for five years, and now it's finally out on iTunes! Interviewees in the film include Ray Kurzweil, Eliezer Yudkowsky, David Chalmers, Bill McKibben, Marshall Brain, Richard A. Clarke, and many others. Learn more about the film by reading the director's statement.
Featured Volunteer: Frank White
This month, we thank Frank White for his volunteer work on the French translation of Facing the Singularity. Frank first came across the Singularity Institute while searching for papers in the field of artificial intelligence. Translating works on the Singularity can be a challenge due to all the specific and scientific terms involved, some of which may not have direct analogues in other languages. Rather than using new words in the middle of the text, which can be jarring to readers, Frank opts for rewording the original text to keep a smooth flow in the translation. Thank you, Frank!
You, too, can sign up to volunteer for the Singularity Institute at singularityvolunteers.org.
Featured Summit Video: Jaan Tallinn: "Why Now? A Quest in Metaphysics"
Our featured Summit video this month is a mind-bending talk — with great animations! — by Jaan Tallinn called Why Now? A Quest in Metaphysics. Here's the abstract:
The word "singularity" usually denotes something exceptional, a situation that breaks a given model. It therefore seems like an incredible coincidence that we were born just decades before an imminent technological singularity that threatens to break our model of the evolution of the entire universe. What if that incredible coincidence is merely an illusion though -- what if our model is not correct to begin with? The talk combines intelligence explosion, multiverse, anthropic principle and simulation argument into an alternative model of the universe — a model where, from the perspective of a human observer, technological singularity is the norm, not the exception.