Friday, 11 March 2016

BILL GATES SPEECH AT STANFORD UNIVERSITY 2008


Bill Gates Speech at Stanford University
Bill Gates College Tour 2008
Stanford University

Well, it's great to be back here at Stanford. You may know that Microsoft's CEO went to Stanford, but I induced him to drop out. So, he never got a degree from Stanford. He did get an undergraduate degree, but I still think of him as a fellow dropout. (Laughter.)

We have lots of great people who've come from Stanford: Rick Rashid, who runs our research; Chris Jones, tons of great people, so we owe a lot to the school.

And the collaboration that's going on today, whether it's between Microsoft and Stanford in areas of software advances or between my foundation and Stanford on global health work, really is fantastic.

I spent the afternoon meeting with faculty, talking about the progress on those things, and sharing our ideas about where we go from here. It's really exciting, because my optimism about technology is strongly reinforced when I meet these brilliant researchers and see that they are going to get the resources to take on these ambitious goals.

John mentioned that the middle of this year is a change for me, that I'll switch to being full time at the foundation and part time at Microsoft, and that could be traumatic for me. I was 17 years old when I started working full time on Microsoft, and I've done it basically every working day since then. So, who knows what it will be like to make the change? I'm looking forward to it, and some friends said that they'd like to volunteer to help make a little video so that I'd understand what my last day will be like, and how things will change. So, let's take a look at the video they helped make.

BILL GATES: Well, we certainly had a lot of fun making that, but the transition is going very well. You saw Ray Ozzie and Craig Mundie in the video, who are taking over a lot of the things I've been doing. And I'll still be very involved in some things I've had a lot of passion about, including natural user interface, some things about how we structure knowledge, and really take on the big frontiers in software.


Let me talk now about what I think software will do in the decades ahead. Certainly if you go back to the start of Microsoft, nobody thought of software as being important at all. There was no software industry. What little software there was came bundled with the mainframe, those very expensive computers. And computing itself was almost thought of as a threatening, scary thing, where governments and large companies would use it to track information about you and to print bills that were never right. People talked about stapling the punch cards that came with your bill, if you've ever heard of a punch card, to mess up these evil computers.

And so it was a real mind shift to say that computing was going to be about individuals, that it was going to be about empowerment, and that even more important than the hardware would be the software that was available, and that a gigantic industry would grow up around that.

Now, that dream required some heroic assumptions. We had to believe that the cost of the hardware would come down. We had to believe that the volume would go up. And only then would the economics of being able to spend tens of millions of dollars to write a software package, and yet being able to sell it for say $100 or less, actually make sense.
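To make those economics concrete, here is a minimal break-even sketch; all the dollar figures are hypothetical, chosen only to show why high volume was the assumption that mattered.

```python
# Illustrative arithmetic only; all figures are hypothetical, not Microsoft's actuals.
development_cost = 20_000_000   # "tens of millions" to write the software package
unit_price = 100                # sell it "for say $100 or less"
marginal_cost = 5               # assumed cost to duplicate and ship one more copy

break_even_units = development_cost / (unit_price - marginal_cost)
print(f"Copies needed just to break even: {break_even_units:,.0f}")  # ~210,526
```

Only a mass-market platform could move that many copies, which is why driving hardware cost down and volume up had to come first.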

And so we undertook the idea of reaching out to other people, getting them to start software companies, and making sure that the personal computer became that high volume platform. In fact, today the software industry is gigantic, and the range of solutions and creativity in that industry is absolutely phenomenal.

That's really changed the way we think about computing. Today, we think about computing as affecting almost everything. Ten years ago, I talked about the start of the first digital decade. That's about the time where the Internet was just showing up, and nobody was doing their photography in digital form or banking online or organizing their trips or looking at stock results. Well, today, 10 years later, many of those activities, certainly in the rich countries, we almost take for granted. The idea of a printed phone book or a CD or a record almost seems antiquated.

My daughter doesn't know what a record is. I keep meaning to go find one and show her, but they're hard to find nowadays. Soon enough things like the phone book or a print-based encyclopedia will be equally antiquated.

So, we're now at the start of what we call the second digital decade, and I think the changes, the impact of this second digital decade will be far more dramatic than the first; in fact, as dramatic as all the things that software has done in this entire 30-year period since the personal computer came along.


Part of that is because of the foundation we have. We have over a billion personal computers out there, and several billion people who've had a chance to use those. We have several billion people who use cell phones. We have somewhat fewer, about 300 million people, connected up to the broadband Internet, but that's a number that keeps growing quite dramatically.

We reached an interesting milestone just recently where China now has more broadband users than the United States. You can be sure that the United States won't catch up, because China has a lot of people who are going to be connecting up. In fact, of all the IT related markets, the personal computer itself and software are the only ones where the U.S. market is still much bigger than the market in China. So, it's a very, very global business in terms of where the talent is, where the innovation is, where the markets are, and different ways of using these tools to have an impact.

I think it's fantastic that the Internet has made the world a smaller place. The evolution of the personal computer, from a device where you create and edit documents, to one where you can do a little bit of e-mail, to one where you can get a little bit of content, to now, where almost everything should be digital by default, is a big mind shift.

For any industry it has huge implications. Even for education, which we think of as having been about the same for the last several hundred years: if somebody said to you that the best math teacher lived in 1890, you could say, well, maybe that's true. You couldn't say that about the best person who understands physics, because there's been an accretion of knowledge, and people are building on each other's understanding.

In some areas, for the ability to watch people who practice very well, to see their results analyzed numerically, and to understand what their techniques are, it's been difficult to create that learning cycle. Now that we have things digitally, that we can store videos digitally, look at test scores and correlate things, and let other teachers see what those teachers are doing, try out those best practices, and find out how well they're working, even in that area we get a fairly substantial change.

Education will probably split into different things. For example, how many universities should have to give lectures on subjects like physics? Well, the answer is very few, because whoever does that well can put it out on the Internet and make it available for free, and there's certainly a trend towards doing this. Assuming it gets localized into different languages, which, using labor across the Internet, even volunteer labor, should be a very straightforward thing to achieve in most places, you then have the best teachers for the whole college curriculum available to everybody on a worldwide basis.

I personally go online and watch courses about physics or chemistry or anything that I want to know about -- I have to admit they're mostly MIT courses at this point, but I look forward to seeing more Stanford courses up there -- and I feel so privileged to be able to do that.

Not a week goes by that my children aren't asking me some question where I go to the Internet and say, okay, what is it about stars or different animals, so that I can take a story back and actually be a dad who knows the answers to these questions and encourages their curiosity. None of that would have been possible before.

The ambition level we can have for different realms of activity should be much higher, and it's because of what software can achieve. If we think of somebody who works in an office, today they are really information starved: their ability to navigate, to understand customer trends and quality and costs and opinion, to survey information and look at how it's changing over time, to look at key indicators that make sense to them, to collaborate with people at a distance. You can just talk to these people, what we call information workers, about how their time in meetings is spent and how hard they find it to get data, and you understand they are not yet fully empowered.

Even the way communications works, where they think about phone numbers and busy signals, about whether to send e-mail or instant messaging, the way that when you're at a distance you can't really meet and collaborate in a rich way, it is very, very antiquated. It's way better than it was say 10 or 20 years ago, but it's nowhere near what it can be.

When you take products like cars and planes or any physical product, do the design digitally, share those plans around, and let people try out simulation models of what might happen with that product over a period of time, you're shortening design cycles.

So, when you take the fact that there are more educated people on a global basis, that they're connected, and that the power of software will give them better tools, not just to work together but also to model and understand the nature of the product work they're doing, then innovation will accelerate, and it will accelerate on this foundation of the advances in computing and software.

Why can we be so sure about this? I mean, after all, when Gordon Moore first predicted that the number of transistors would double every two years or so, it was just a prediction.

Well, we can see that that prediction about transistors will remain true for the next 10 years. We can see that storage capacity will also grow at exponential rates, and that optical fiber bandwidth will go up at exponential rates.
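As a back-of-the-envelope illustration, the doubling law is easy to state in code; the starting count, dates, and two-year doubling period below are assumptions for the sketch, not measured figures.

```python
# A minimal sketch of the exponential scaling described above: transistor count
# doubling roughly every two years. All concrete numbers here are illustrative.
def transistors(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count `years` out under a fixed doubling period."""
    return initial * 2 ** (years / doubling_period)

# From a hypothetical 1-billion-transistor chip in 2008, look ten years ahead:
print(f"{transistors(1e9, 10):.1e}")  # 3.2e+10 -- a 32x increase by 2018
```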

If we look for bottlenecks in this, we only see them in a few places. We see a bottleneck in terms of clock speed of the microprocessor. So, finally we have to deal with programming computers to work in a parallel fashion. It's one of those great problems that when I was in computer science we thought, hey, maybe we're about to solve this. Well, now we really have to solve it. The brute force of clock speed scaling is not likely to come and bail us out the way that it has in the past.
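As a toy illustration of that shift, here is a minimal sketch using Python's standard multiprocessing module; the workload function is a made-up stand-in for real CPU-bound work.

```python
# When clock speeds stall, the same work has to be spread across cores.
from multiprocessing import Pool

def simulate(seed: int) -> float:
    """A made-up stand-in for one unit of CPU-bound work."""
    total = 0.0
    for i in range(1, 1_000_000):
        total += (seed * i) % 7
    return total

if __name__ == "__main__":
    tasks = list(range(8))
    serial = [simulate(t) for t in tasks]   # one core, bounded by clock speed
    with Pool(processes=4) as pool:         # the same tasks across four processes
        parallel = pool.map(simulate, tasks)
    assert serial == parallel               # identical answers, computed concurrently
```

The hard part alluded to above is exactly what this toy example hides: real programs share state, and making them both correct and fast in parallel is still an open engineering problem.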

There are also some issues of modularity and of proving programs correct, both for reliability and for security, that are now essential as we take all of society and connect it up digitally, with financial records and product orders and private medical data all being stored digitally.

So, for the basic foundation of how we understand the way software works, to be able to say whether this software maintains privacy, whether this software controls this information properly, there are some fundamental advances in computer science that we need to drive.

Most of the things we want -- cheap screens, so that for US$30 or $40 you can turn every wall in your room into a very high-resolution display; cameras that, when combined with software, can recognize the kind of gestures you make and who's in the room doing what -- these things will be very inexpensive.

And, in fact, we're on the verge of a big change in how you interact with all these devices. It's been the mouse and keyboard overwhelmingly. It was just the keyboard at first; then the mouse, actually invented not far from here in the '60s by Douglas Engelbart, became mainstream along with the graphical interface, and that became the standard. That is the way we interact: you sit down at a chair, and it's really just one person.

You're starting to see the beginnings of a change to a broad range of interaction techniques I call natural user interface. You see it in the 3D controller that the Wii has.

You see it in the touch that the iPhone has.

You see it in products like Microsoft Surface, where we have cameras that can look at any gesture or any object that appears, and see what you're doing.

You see it in RoundTable, which sees who's in the room and decides who's speaking by combining multiple camera feeds.

You see it in products like the TellMe software that runs on mobile phones, where instead of trying to use that keyboard, you simply say what you're interested in, whether it's a directory lookup or a software interaction, and it recognizes that speech.

We now have the computing power to make natural user interface real.

A form factor that I'm a big believer in, that I'm excited to make sure we keep investing in to drive it to be attractive to the mainstream, is the tablet device. This is a device you can read off the screen; it's light and cheap, with long battery life, and eventually a replacement for paper-based textbooks.

My daughter goes to a school where they use that Tablet PC, and they use the pen, and they're very adept at it, and it's amazing to see how they kind of learn in a different way, because they have that tool.

There's still a lot of work to be done to get that down to, say, several hundred dollars, with the lightness and battery life that we need, but that is absolutely coming. It's a fundamental tool that will change the consumption of learning material, and even in the workplace it will be the device that you take off to meetings.

We now talk not just about computers on the desk but computers in the desk, because we can recognize what you're putting there, and let you touch and expand things. Your desk will just be a horizontal surface display, your whiteboard will just be a vertical surface display. So, the ability there to take business information or project schedules and touch and manipulate and see those things, and then have a portion of it that's a videoconference with another person where you're working together and interacting, that will just be commonplace. When that's cheap, people will go to that, and we need a whole new generation of software that can interact and use those things.

In the consumer space, experiences like TV, which are very passive and channel-oriented today, will change to be very personalized and very interactive. The dichotomy between broadcast video, which you get say through your cable or satellite provider, and video on the Internet will be brought together, so that if I have a child who's in a sports event and somebody with an HD camera just happens to go and film it, when I go back to my TV menu, based on my interests, that will show up as one of the top choices I might be interested in. There's nothing that will divide those two worlds. The advertising will be targeted, the shows will be interactive. Think of watching the Olympics and picking which sports you're interested in, or the election and seeing the background and the breakdown of what's going on with different votes. We are so used to a very limited TV experience that people don't really appreciate how dramatic this revolution, which is literally on the verge of happening, is going to be.

Today, there are a few million people getting their popular mainstream video through pure Internet feeds that can be individualized, so that infrastructure is starting to get out there. As we get to the tens of millions and hundreds of millions, all the content programmers will realize that between what's been a set-top box in this environment and what's been a videogame there is no dividing line; it's just a spectrum of content.

So, many of the things that will be available on TV in terms of watching together and chatting with your friends who are at a distance, or trying different things out, some of those you see more in the videogame world today than in the TV world, but we think they'll be very broadly adopted.

Things like organizing the memories of your children as they grow up, and having the images and the homework and the exchanges with them, and being able to go back and view that in rich and fun ways, that can happen very automatically.

Today, we're still very device-centric, and we rely on the user to move information between their phones, between their phones and their PCs, and between one PC and another. Well, as we get this sort of unlimited power in the cloud, both in terms of computation and storage, the ability to move that data automatically, so that if you buy a new phone your information just shows up, and if you borrow a PC your data is there but only available to you, will become commonplace.
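Here is a minimal sketch of that idea, assuming a simple last-writer-wins merge rule and made-up record names; no real service's protocol is implied.

```python
# Each device reports its records, and a merge in the cloud decides current state.
# The data model and merge rule are illustrative assumptions, not any real service.
from dataclasses import dataclass

@dataclass
class Record:
    key: str        # e.g. "contacts/alice" (hypothetical naming scheme)
    value: str
    timestamp: int  # logical clock; higher means more recent

def merge(cloud: dict, device_records: list[Record]) -> dict:
    """Fold a device's records into cloud state, keeping the newest per key."""
    for rec in device_records:
        current = cloud.get(rec.key)
        if current is None or rec.timestamp > current.timestamp:
            cloud[rec.key] = rec
    return cloud

cloud_state: dict[str, Record] = {}
merge(cloud_state, [Record("contacts/alice", "555-0100", 1)])  # from the old phone
merge(cloud_state, [Record("contacts/alice", "555-0199", 2)])  # edited on the PC
# A brand-new phone just pulls cloud_state and "your information just shows up".
print(cloud_state["contacts/alice"].value)  # 555-0199
```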

So, the willingness to work with multiple form factors, even in the car where it's more voice-oriented, or in the living room where it's more of a 10-foot, at-a-distance experience with gestures and a simple remote control, or using your phone to control things, those experiences will not be bifurcated like they are today.

Now, we also need to revolutionize how we write software, so that we can define things at a much higher level. That really hasn't changed much in these last 30 years. We're still writing low-level imperative code, such that you can take something like two banks whose products are 90 percent identical and end up literally with a million lines of code that are different between those two banks. And yet if you described their products in English, you'd find only about 40 pages of difference.

And so you ask, what is that explosion of complexity, which is expensive, fragile, and hard to prove correct? Well, it's a failure of abstraction. We have not changed that level of abstraction. And finally we have the computing power, and some of the ideas, to create runtime environments where, particularly in focused domains like the business domain that so much software is written for, we can make some huge breakthroughs.
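To make that concrete, here is a minimal sketch, with made-up product names and parameters, of what a higher level of abstraction can look like: each bank's products become short declarative specifications, and one shared engine interprets them, so the difference between banks is data measured in pages rather than a million divergent lines.

```python
# All product names and numbers below are hypothetical, purely for illustration.
LOAN_PRODUCTS = {
    "bank_a_standard": {"annual_rate": 0.065, "max_term_years": 30},
    "bank_b_standard": {"annual_rate": 0.060, "max_term_years": 25},
}

def monthly_payment(product_name: str, principal: float, term_years: int) -> float:
    """One shared engine serves every product; only the declarative spec differs."""
    spec = LOAN_PRODUCTS[product_name]
    if term_years > spec["max_term_years"]:
        raise ValueError("term exceeds this product's limit")
    r = spec["annual_rate"] / 12          # monthly rate
    n = term_years * 12                   # number of payments
    return principal * r / (1 - (1 + r) ** -n)  # standard amortization formula

print(round(monthly_payment("bank_a_standard", 300_000, 30), 2))
```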

This is part of the reason why one of the best investments any company makes, I think, is in its research group and in the way that research group connects up with universities. It's something Microsoft looked forward to doing, and about 15 years ago we were successful enough that we were able to start down the research path. People like Nathan Myhrvold and Rick Rashid came in and built something really phenomenal; not just in terms of the research it does, but in terms of the way it lets us understand the brilliant ideas at places like Stanford, it has made a huge difference for us.

Every one of our products is dramatically better because of that work. New things we do, like ink recognition on the tablet, all this visual recognition work that is just coming to the mainstream, and our machine translation work, were developed there. The breakthrough work that gives us the belief that we can take what search is today and do something dramatically better than that, that optimism comes because we have great people in our research group who are doing very advanced things.

We spend a bit over $6 billion a year on R&D, but it's really the long-term piece, ranging from graphics techniques to quantum computing to natural user interface, that really defines what the future is going to be.

We've now spread that activity across the globe. When people come to me and say, hey, would you put a research center in a certain place, I used to say, well, if you have a billion people, we'll put a research center there, because we have one in China, one in India, one in the United States -- now we have three in the United States, so obviously I'm breaking my criterion that you have to have a billion people. So, it will be a little harder to state the criteria exactly, but it really has to do with where the top universities are.

That research activity is risk-oriented, and it's actually fairly surprising to me how little research is funded by businesses. Even here in the United States, if you take what Bell Labs and Xerox PARC did, which is some of the foundational work that Microsoft benefited from immensely, the entire personal computer industry got a huge boost from that work. And unfortunately those companies didn't get an economic advantage; the way they managed the research and thought about it actually set an example that may have set back the willingness of companies to make these investments. I think now you see a range of companies like GE coming around and saying that this is an important thing, but that's a huge challenge. Those are the kinds of jobs and breakthroughs that really are going to change the world.

Now, when we think about the sciences broadly, the role of software is becoming more important. In the past you could say, well, what was the language of science? You could say mathematics, and it was very important for physicists, chemists, biologists to have some understanding of particular parts of mathematics to express their ideas, to write down formulas, and to make predictions.

Today, the amount of data in most of these sciences is large enough that we can say that computer software, databases, and the pattern matching that comes out of software breakthroughs are really important for what is going on in the sciences, particularly in biology, but I'd say almost as strongly for astronomy, where with the amount of data, testing a theory about the density of things or the creation of things is not just one telescope, not just being there at midnight, seeing something cool, writing it up, and getting the Nobel Prize; rather, it's deep analysis across massive amounts of data.
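As a toy illustration of that shift, here is a minimal sketch that scans a made-up survey catalog for objects whose brightness changed between observation epochs; the field names, values, and threshold are all hypothetical.

```python
# Rather than one telescope and one lucky night, software scans the whole catalog.
catalog = [
    {"object_id": "SRC-001", "mag_epoch1": 18.2, "mag_epoch2": 18.3},
    {"object_id": "SRC-002", "mag_epoch1": 17.5, "mag_epoch2": 14.9},  # brightened a lot
    {"object_id": "SRC-003", "mag_epoch1": 19.1, "mag_epoch2": 19.0},
]

THRESHOLD = 1.0  # magnitudes of change worth flagging (illustrative cutoff)

candidates = [
    row["object_id"]
    for row in catalog
    if abs(row["mag_epoch2"] - row["mag_epoch1"]) > THRESHOLD
]
print(candidates)  # ['SRC-002'] -- flagged for follow-up across the full dataset
```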

So, we are sort of the handmaiden of those advances, and making sure that we're reaching out and collaborating with the sciences, and understanding from them how do they want to process that genomic data, how do they want to take and get insights into it, that's very important.

We're doing our best to reach out to scientists, getting ourselves beyond pure computer science, which is very important and has lots of tough problems, to play a role in this more interdisciplinary activity that's happening in a very deep way in the top universities. In fact, in my discussions with faculty this afternoon I was really pleased to see how Stanford is really trying to push the limits of getting departments to work together, and particularly bringing in computer science.

One area of complexity that I'm sure fascinates all of us is studying the brain. There's a lot of great research going on in that. One of the people we're working with and providing software to is [Jeff] Lichtman at Harvard. So, I wanted to take just a quick look at a short video about what he's doing, and then show how software fits into that.

(Video segment.)


BILL GATES: Yeah, I've got the HD View running right here, so you can get a little sense of it.

Processing lots of image data, it turns out, is something we can now do very well with the right algorithms, letting you scan in and out and even apply a lot of recognition algorithms to understand what you're seeing. What we really want here is a database of all the neuron connections inside the brain, and eventually an understanding of exactly what's being connected.

So, if we look here, this is the layer diagram, and I can go in and look at individual layers at any time, try and understand exactly what's changing as we go through that. A lot of data, but processed very quickly. Then here's where we take an algorithm that's trying to understand exactly what the patterns are, and then map that, as was being said, into those structures.
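For readers curious how this kind of smooth pan-and-zoom over enormous scans is typically made tractable, here is a generic sketch of a power-of-two tile pyramid; this illustrates the standard technique, not HD View's actual API, and the tile size and example numbers are assumptions.

```python
# A generic multi-resolution tiling sketch: the viewer only ever fetches the
# handful of tiles needed to draw the current viewport at the current zoom.
TILE_SIZE = 256  # pixels per tile edge at every zoom level (assumed)

def tiles_for_viewport(x: int, y: int, width: int, height: int, level: int):
    """Return the (level, col, row) tile addresses covering a viewport.

    Level 0 is full resolution; each higher level halves the resolution,
    so coordinates given in full-resolution pixels are scaled down first.
    """
    scale = 2 ** level
    first_col = (x // scale) // TILE_SIZE
    first_row = (y // scale) // TILE_SIZE
    last_col = ((x + width - 1) // scale) // TILE_SIZE
    last_row = ((y + height - 1) // scale) // TILE_SIZE
    return [(level, c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# Zoomed out to level 4, a window over a huge scan needs only a dozen tiles:
print(len(tiles_for_viewport(0, 0, 1024 * 16, 768 * 16, level=4)))  # 12
```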

Now, obviously this is just the beginning of this type of capability. To really get the model and understand what the messages mean, all the way up to the highest level, is going to take a lot of time, but it's a very software-driven activity.

One thing that's amazing, in the computer industry and the sciences broadly, is how much students have really been at the heart of a lot of breakthroughs. John mentioned a lot of the great companies that Stanford alumni or dropouts have started, and there are other examples as well. So, it's very interesting that at a young age people are very open-minded about new approaches.

We announced a new program today to let students have, for free, the same tool software that professional developers use, things like Visual Studio or Expression, really trying to broaden that out, not only to the computer science department, where we've already had grant programs, but to the other departments and even down to a younger age level, so that access to the very best tools is there from the beginning. Some of these people will go on and start companies; some will just be a lot better in whatever activities they engage in.

The level of interest is very high. We have a contest every year called the Imagine Cup. Last year, it drew about 100,000 students. This year it will be about 150,000 students. The United States is the third biggest country, with Brazil and India having higher enrollment, but the U.S. at 15,000 is very significant. And the quality of these entrants is really unbelievable. In fact, when people won a programming contest, we would basically just give them a job, and some of the people who have come out of that have been really phenomenal in terms of what they've gone on to do.

And what we're seeing is that we're really getting to the point where your level of education is what defines your opportunity. It's less about where you grew up; simply having access to these tools, if you're lucky enough to get access, means the sky is really the limit in terms of what can be done.

So, this brings us to my final topic, which is the question of, as we have all these advances, how the benefits of those advances are spread among, say, the richest 2 billion on the planet, the middle 2 billion, and the bottom 2 billion.

In fact, our record to date is that although there are benefits in terms of improved medicine and food and electricity for a high percentage of people, the relative benefit has gone overwhelmingly to essentially the people who need it the least, where the marginal benefit is lower than it is in, say, the poorest 2 billion, where literally for want of a few hundred dollars a child's life is lost.

Of the 12 million children who die every year, fewer than 1 percent are in the rich countries, and yet if you look at the medical research related to those deaths, over 90 percent relates to the conditions found in the richest countries.

So, you have this big disparity. Consider how much money is spent on baldness versus on malaria. Well, the ratio is about 50 to 1 in favor of baldness. Malaria, of course, kills over a million people a year.

I was pretty stunned when I found out about these statistics, and I have to say it was quite a while after I dropped out of Harvard, over 10 years, that I read about a disease called rotavirus that was killing half a million children a year, and I thought, what the heck is rotavirus? I'd absolutely never heard of it; this article must be wrong. You can't have a disease that's killing half a million children a year and not have courses on it. I flipped through the course catalogue; I never saw anything about any of this. In fact, the one medicine there was for that disease, the one vaccine, was taken off the market for reasons that really shouldn't have applied to the key target market, which was the poor countries.

So, we have this disparity that as great as our system is, if there's not a market need, it doesn't drive the innovation to the particular requirements of the poorest.

And yet I think that's a very solvable thing. In fact, I think there's an increasing awareness, a desire of people working at companies, of companies themselves, and of universities, to have an impact that's measured in an additional way beyond pure market incentives.

Our research group in India has a special group with a lot of social scientists in it that goes out to the poorest and observes and talks to them, and very quickly you realize that for that segment there's no electricity and there's widespread illiteracy; you're not going to give them a personal computer. I don't care if it's a 10-cent personal computer; the problem is very different than that.

So, some of the solutions they've come up with, using cell phones or even just DVDs, have been amazing. They take these agriculture extension workers who go out and help farmers, tell them what to do, and they come with a TV set and a DVD, and the very best farmers have been filmed doing these things. Think of it as like American Idol, except this is "Farmer Idol," and the farmers really want to be the ones chosen for the video. That technique has done more to improve the productivity of those farmers; it's three times as effective as just sending that person out, and yet the workers don't need to be nearly as trained. So, some technologies, like a DVD player carried out to a village, when used in the right structure, can have a very dramatic impact.

So, at every tier -- the bottom 2 billion, the middle 2 billion -- we have to think through what technology can work. For vaccines, you have to keep them cold as they get out to these rural villages; that's a very tough thing.

One of the things I'll be spending time on is reaching out to both universities and companies and encouraging them to get more involved in this: food companies on micronutrients and the ideas they have about buying food and helping the smallholder farmers who represent the majority of the abject poor in the world; the pharma companies in terms of doing more on these diseases.

I absolutely think universities have a big role to play here. One element of it is that I don't think students should graduate without having some sense, ideally both by learning about it and by having some direct experience of it, of the average human condition in the world, as opposed to the condition we normally experience by living here in one of the very richest countries in the world.

So, I think we can apply ourselves to this. I don't think it requires a revolution, but it does require a focus; it requires some value system that gets expressed, and some measurement, both of who's doing it well and who's not, that's really going to drive more rapid change.

So, overall I hope you get a sense of my optimism about how technology broadly and software in particular will become an enabling element in the years ahead. I think it's a wonderful time to be a student and to have gathered these skills, and so I'll be very excited to see the great work that you can do.

Thank you.
