Archive for July, 2011

Foresight Links for Jul 18, 6:00 am


Foresight Links for Jul 11, 6:00 am


Foresight Links for Jul 4, 6:00 am


  • The Illusion of Control in an Intelligence Amplification Singularity, Michael Anissimov @ Accelerating Future:
    From what I understand, we’re currently at a point in history where the importance of getting the Singularity right pretty much outweighs all other concerns, particularly because a negative Singularity is one of the existential threats which could wipe out all of humanity rather than “just” billions. The Singularity is (Read more…)
  • Singularity Summit 2011, Michael Anissimov @ Accelerating Future:
    The press release for SS11 is posted. Featuring Ken Jennings, Christof Koch, Tyler Cowen, Ray Kurzweil, and many others. The venue will be the same as 2009 — the 92nd St. Y in New York City. The theme we are pegging this year’s conference to is the Watson victory. (Read more…)
  • Replying to Alex Knapp, July 2nd, Michael Anissimov @ Accelerating Future:
    Does Knapp know anything about the way existing AI works? It’s not based around trying to copy humans, but often around improving this abstract mathematical quality called inference. I think you missed my point. My point is not that AI has to emulate how the brain works, but rather that (Read more…)
  • The Final Weapon, Michael Anissimov @ Accelerating Future:
    It’s not really “fair”, but history generally consists of people getting better and better weapons, and whoever has the best weapons and the best armies makes the rules. The number of historical examples of this phenomenon is practically unlimited. The reason America is respected and feared today is because of (Read more…)