software affordable."

The company grew by helping chip manufacturers tune the FSF compiler, GCC, for their chip. This was often a difficult and arduous task, but it was very valuable to the chip manufacturer because potential customers knew they could get a good compiler to produce software for the chip. While Intel continued to dominate the desktop, the market for embedded chips to go into products like stoves, microwave ovens, VCRs, or other smart boxes boomed as manufacturers rolled out new chips to make it cheaper and easier to add smart features to formerly dumb boxes. The engineers at the companies were often thrilled to discover that they could continue to use GCC to write software for a new chip, and this made it easier to sell the chip.

Cygnus always distributed the Source of its modifications to GCC, as the GNU General Public License demanded. This wasn't a big deal because the chip manufacturers wanted the software to be free and easy for everyone to use. This made Cygnus one of the clearinghouses for much of the information on how GCC worked and how to make it faster.

Henkel-Wallace is quick to praise the power of publicly available source code for Cygnus's customers. They were all programmers, after all. If they saw something they didn't like in GCC, they knew how to poke around in its insides and fix it. That was their job.

"[GCC] is a compiler tool and it was used by developers so they were smart enough. When something bothered someone, we fixed it. There was a very tight coupling," he said.

He openly wonders, though, whether the average user of a word processor or other basic tool will be able to do the same. He says, "The downside is that it's hard to transfer that knowledge with a user who isn't a developer. Let's say Quicken has a special feature for lawyers. You need to have a more formal model because the lawyers aren't developers. (We're fortunate in that regard.)"

That is, lawyers aren't schooled enough in the guts of computer development to complain in the right way. A programmer could say, "GCC is optimizing away too much dead code that isn't really dead." Other folks in the GCC community would know what was going on and be able to fix it. A lawyer might just say, "Quicken screwed up my billing and had me billing twenty-six hours in a day." This wouldn't pinpoint the problem well enough for people to solve it. The lawyer doesn't understand the insides of the software the way the programmer does.

In situations like this, Henkel-Wallace believes that a corporate-style team may be the only one that can study the problems thoroughly enough to find solutions. Intuit, the manufacturer of Quicken, is well known for videotaping ordinary users as they try the product for the first time. This lets the company pinpoint rough spots in the program and identify places where it could be improved. This relentless smoothing and polishing has made the product one of the best-known and most widely used tools on the desktop. It isn't clear that non-programmers could have achieved the same quality by working together with the Source at their disposal.

11.5 THE SOURCE AND THE LANGUAGE THAT WE SPEAK

..............................................

There are deeper, more philosophical currents to the open source world. The personal computer industry is only a few decades old. While it has advanced rapidly and solved many problems, there is still very little understanding of the field and of what it takes to make a computer easy to use. This has been the great struggle, and the free source world may be an essential part of the journey.

Tim O'Reilly, the publisher of many books and a vocal proponent of the open source world, says, "We've gone through this period of thinking of programs as artifacts. A binary object is a thing. Open source is part of thinking of computers as a process." In other words, we've done a good job of creating computers you can buy off the shelf and software that can be bought in shrink-wrapped boxes, but we haven't done a good job of making it possible for people to talk to the machines.

To a large extent, the process has been a search for a good language to use to communicate with the computer. Most of the recent development followed the work at Xerox PARC that created some of the first graphical user interfaces. Apple followed PARC's lead and Microsoft followed Apple. Each bought into the notion that representing files as pictures on the screen would create a metaphor that made it easier for people to interact with their computers. Dragging a file to the trash was somehow easier for people to do than typing a cryptic command like "rm."

In the 1980s, that sort of graphical thinking was considered brilliant. Pictures were prettier than words, so it was easy to look at the clean, pretty Macintosh screen and think it was easier to use just because it was easier to look at.

But the pretty features merely hid a massive amount of complexity, and it was still hard to work with the machines. Don Norman, a human/computer interface engineer at Apple, once wrote a fascinating discussion of the company's design of their computer's on-off switch. He pointed out that the switch couldn't be a simple switch that merely cut the power, because the computer needed to orchestrate the start-up and shutdown procedure. It needed to close up files, store data safely, and make sure everything was ready to start up again.

The design of the power switch was made even more complicated by the fact that it was supposed to work even when the computer crashed. That is, if bad programming jumbles the memory and screws up the central processor, the power switch is still supposed to shut down the machine. Of course, the computer couldn't even add two numbers together after it crashed, so it couldn't even begin to move through all the clerical work necessary to shut down the machine. The Macintosh on which I wrote this book can crash so badly that the power switch doesn't work, and I can only reset it by sticking a paper clip into a hidden hole.

Norman's work shows how hard it can be to come up with a simple language that allows humans and computers to communicate about a task that used to be solved with a two-position light switch. This problem can be seen throughout the industry. One computer tutor told me, "I am so tired of telling people to shut down their computers by pushing the 'Start' button." Microsoft Windows places all of the features on a menu tree that grows out of one button labeled "Start." This may have been a great way to capture the sense of new potential that Microsoft felt it was selling, but it continues to confuse new users of the machines. Why should they push Start to stop it?

The quest for this Source-level control can take many strange turns. By the middle of the 1980s, programmers at Apple realized that they had gone a bit too far when they simplified the Mac's interface. The visual language of pointing and clicking at icons may have been great for new users, but it was beginning to thwart sophisticated users who wanted to automate what they did. Many graphics designers would find themselves repeatedly doing the same steps to image files, and it was boring. They wondered, why couldn't the computer just repeat all their instructions and save them all that pointing and clicking?

In a sense, the sophisticated Mac users were looking for the Source. They wanted to be able to write and modify simple programs that controlled their software. The problem was that the graphical display on the Mac wasn't really suited to the task. How do you describe moving the mouse and clicking on a button? How do you come up with a language that means "cut out this sample and paste it over here"? The actions were so visual that there weren't any words or language to describe them.

This problem confounded Apple for the next 10 years, and the company is slowly finishing its solution, known as AppleScript. The task has not been simple, but it has been rewarding for many who use their Macintoshes as important links in data production lines. Apple included instructions for moving icons to locations, uploading files, changing the color of icons, and starting up other programs.

The nicest extension was a trick that made the AppleScript "recordable." That is, you could turn on a recorder before stepping through the different jobs. The Mac would keep track of your actions and generate a program that would allow you to repeat what you were doing. Still, the results were far from simple to understand or use. Here's a simple snippet of AppleScript code that will select all files in one directory with the word "Speckle" in their title and open them up with another application:

This Source can then be run again and again to finish a task. Making this tool available to users has been a challenge for Apple because it forces them to make programming easier. Many people learn AppleScript by turning on the recording feature and watching what happens when they do what they would normally do. Then they learn how to insert a few more commands to accomplish the task successfully. In the end, they become programmers manipulating the Source without realizing it.

O'Reilly and others believe that the open source effort is just an extension of this need. As computers become more and more complex, the developers need to make the internal workings more and more open to users. This is the only way users can solve their problems and use the computers effectively.

"The cutting edge of the computer industry is in infoware. There's not all that much juice in the kind of apps we wrote in the eighties and nineties. As we get speech recognition, we'll go even more in the direction of open source," he says.

"There are more and more recipes that are written down. These are going to migrate into lower and lower layers of software and the computer is going to get a bigger and bigger vocabulary."

That is, more and more of the Source is going to need to become transparent to the users. It's not just a political battle of Microsoft versus the world. It's not just a programmer's struggle to poke a nose into every corner of a device. It's about usability. More and more people need to write programs to teach computers to do what they need to do. Access to the Source is the only way to accomplish it.

In other words, computers are becoming a bigger and bigger part of our lives. Their language is becoming more readily understandable by humans, and humans are doing a better job of speaking the language of computers. We're converging. The more we do so, the more important the Source will be. There's nothing that Microsoft or corporate America can do about this. They're going to have to go along. They're going to have to give us access to the Source.

PEOPLE

When I was in college, a friend of mine in a singing group would often tweak his audience by making them recite Steve Martin's "Individualist's Creed" in unison. Everyone would proclaim that they were different, unique, and wonderfully eccentric individuals together with everyone else in the audience. The gag played well because all the individualists were also deeply committed to living a life filled with irony.

The free source world is sort of a Club Med for these kinds of individualists. Richard Stallman managed to organize a group of highly employable people and get them to donate their $50+-per-hour time to a movement by promising complete freedom. Everyone who showed
