"philosophy is dead"The other day I had a long discussion over the Stephen Hawking's (and by the way my number one hero) quote "philosophy is dead". As you can imagine, I was the only person in the crowd who thought Stephen Hawking was right! Hawking argues that science and particularly theoretical physics has advanced so much that philosophers weren't able to keep up with that.
One of the counter-arguments I heard was that philosophy was never meant to answer questions about the world, but to raise questions that people had not thought about before. Even if you reduce the role of philosophy to that, I would still say this is not a good argument. People who have expertise in a field are the ones who can ask the right questions. A person who lacks basic knowledge of the laws of physics can easily be misguided and end up with poor explanations of natural phenomena; in that sense, a philosopher is no better than a priest.
"we are no more than biological machines."Over and over I hear from people outside the science world, and amazingly also from some people inside the scientific community that they think humans have something that machines will never have. Here again I have to refer to Stephen Hawking's saying "we are no more than biological machines", and by machines it doesn't necessarily mean silicon based machines, but the important thing is that our behavior is determined by a dynamic model in a deterministic way.
Freedom is an illusion. We are as free as a computer program; the only difference is that we are far more complicated than any man-made machine built so far to replicate human behavior. But can we ever build an artificial intelligence that is as smart as, or even smarter than, a human? The answer is absolutely! We have already made computer programs that beat human performance on specific tasks; now we only need to go a step further.

Why haven't we managed to build such an AI so far? Firstly, the field is quite new; in particular, the branch of AI referred to as "machine learning" is quite new. Most of the tools we use in the machine learning community today have been developed in the last 20 years or so. Secondly, the human brain has a tremendous amount of computational power. That means, to be able to compete with humans, we need extremely fast machines. I'm not sure we are there yet, but I think the current generation of hardware is quite close.

That, however, is not the difficult part: even with a machine as fast as a human, we still need the model, the brain. What's nice about machine learning is that it lets you build models from data, but that requires time to gather the data and process it. We humans certainly took our time to gather this data, and natural selection did the processing for us. I'm talking about a billion years of evolution. Does that mean we need a billion years to train a model as powerful as a human? I seriously don't think so. Natural selection is an effective but completely unsupervised method. With a small amount of supervision we can save a lot of processing time.
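The intuition that a supervised signal saves search time can be sketched with a toy experiment. Everything here is my own illustration, not any real system: a single parameter is fitted to made-up data, once by blind random mutation (loosely evolution-like) and once by gradient descent, which uses the error signal directly.

```python
import random

# Toy sketch (my own illustration): fit one parameter w so that y = w * x
# matches data generated with a "true" w of 3.0.
random.seed(0)
data = [(x, 3.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]

def loss(w):
    return sum((w * x - y) ** 2 for x, y in data)

# Evolution-like search: blind random mutations, keep only improvements.
w_evo, steps_evo = 0.0, 0
while loss(w_evo) > 1e-6 and steps_evo < 100_000:
    candidate = w_evo + random.gauss(0, 0.1)
    if loss(candidate) < loss(w_evo):
        w_evo = candidate
    steps_evo += 1

# Supervised gradient descent: the error signal points the way directly.
w_gd, steps_gd = 0.0, 0
while loss(w_gd) > 1e-6:
    grad = sum(2 * (w_gd * x - y) * x for x, y in data)
    w_gd -= 0.01 * grad
    steps_gd += 1

print(f"random mutation: {steps_evo} steps, gradient descent: {steps_gd} steps")
```

The guided search shrinks its error geometrically, while the blind search slows down drastically as it nears the optimum and must wait for ever-luckier mutations; the exact step counts depend on the random seed, but the gap illustrates how even a little supervision can cut the search time.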