Yeah it's so shitty. I had an appointment in Taunton yesterday, and between the late bus and the long detours from the flooding it took almost 3 hours to get there. When I finally did, they told me they were shutting in 15 minutes and I had to rebook. Then another hour waiting for the bus back plus an hour and a half trip, all while it was pouring down. So all in all a pretty ****ing awful waste of time.
Originally posted by The Ellimist
Anyone interested in futurist topics?
Originally posted by Stigma
I am actually, thanks for asking. Michio Kaku wrote a few interesting books on the subject.
It's kind of difficult to wrap your head around the logical implications of some of these technological developments. We're getting to a stage where these advancements can fundamentally alter the way society operates on a qualitative level. What I mean is, from early civilization to modern times most changes have largely been ones of scale. The analogy I'd use is a Civilization game where the same rules apply throughout and you just get more gold, more industry, etc. Only a few advancements, like the Internet, might require special factors in the model.
But then we get new input factors like virtual reality, which taken to its reasonable technological limits means we'd have basically no need to interface with the outside world at all, beyond maintaining said virtual realities. That would fundamentally change the way the Civilization game is played. Likewise with artificial intelligence: in addition to its massive safety issues, you basically obsolete humans with a replacement that is not only superior on a per-unit basis, but also scales so massively (e.g. mass production, jacking up processing power, etc.) that anything and everything becomes either trivial or impossible (i.e. physically impossible).
The other interesting factor to consider is that as our power as a civilization increases, the likelihood of both tail-end outcomes (really bad and really good) increases too. If we get to AGI, we're either immune to all threats prior to the heat death of the universe or it kills us all, so if we plot aggregate welfare vs. time, almost all possible outcomes explode toward one extreme or the other. The only cases where you get some sort of non-extinction, Star Wars-style equilibrium are those with massive tech stagnation (say, declining genetic intelligence outpacing the aforementioned advancements that could counter it).