This is an edited, updated version of an essay I wrote in 2008 when this now popular idea was embryonic and ragged. I rewrote it to convey the core ideas, minus out-of-date details, that I believe will be useful to anyone making things, or making things happen. If you still want to read the much longer original essay it will follow below this edited version. — KK
Written in October 2006 by Paul Graham, "Startup Mistakes" is a classic that needs to be read and reread by anyone setting out to build a company.
In the Q & A period after a recent talk, someone asked what made startups fail. After standing there gaping for a few seconds I realized this was kind of a trick question. It’s equivalent to asking how to make a startup succeed—if you avoid every cause of failure, you succeed—and that’s too big a question to answer on the fly.
The long tail is famously good news for two classes of people: a few lucky aggregators, such as Amazon and Netflix, and 6 billion consumers. Of those two, I think consumers earn the greater reward from the wealth hidden in infinite niches.
But the long tail is a decidedly mixed blessing for creators. Individual artists, producers, inventors, and makers are overlooked in the equation. The long tail does not raise the sales of creators much, but it does add massive competition and endless downward pressure on prices. Unless an artist becomes a large aggregator of other artists’ works, the long tail offers no path out of the quiet doldrums of minuscule sales.
I recently attended an event with a large number of advertising executives. All of them are coming to grips with the change from the era of push media to the era of social media, which might more properly be called “pull media.” At its core, the social revolution allows people to consume what they want, when they want, and largely on the recommendation of friends and other non-professional influencers. Attempt to graft old models onto it and you are doomed to struggle; find models that are native to the medium and you will thrive.
At O’Reilly, we first learned this lesson in 1992, when we published The Whole Internet User’s Guide and Catalog, the first popular book about the Internet, and the first to cover the as-yet undiscovered World Wide Web. (When we published the book, there were only about 200 websites, and the first web conference we convened, “the World Wide Web Wizards Workshop,” had thirty attendees, albeit among them such later luminaries as Tim Berners-Lee and Marc Andreessen.) We had the great good fortune to hire Brian Erwin, formerly the head of activism for the Sierra Club, to help us with our PR and marketing.
By MARC ANDREESSEN
August 20, 2011
This week, Hewlett-Packard (where I am on the board) announced that it is exploring jettisoning its struggling PC business in favor of investing more heavily in software, where it sees better potential for growth. Meanwhile, Google plans to buy up the cellphone handset maker Motorola Mobility. Both moves surprised the tech world. But both moves are also in line with a trend I’ve observed, one that makes me optimistic about the future growth of the American and world economies, despite the recent turmoil in the stock market.
by Nicholas Carr
Norton, 276 pp., $26.95
In September 2013, about a year before Nicholas Carr published The Glass Cage: Automation and Us, his chastening meditation on the human future, a pair of Oxford researchers issued a report predicting that nearly half of all jobs in the United States could be lost to machines within the next twenty years. The researchers, Carl Benedikt Frey and Michael Osborne, looked at seven hundred kinds of work and found that of those occupations, among the most susceptible to automation were loan officers, receptionists, paralegals, store clerks, taxi drivers, and security guards. Even computer programmers, the people writing the algorithms that are taking on these tasks, will not be immune. By Frey and Osborne’s calculations, there is about a 50 percent chance that programming, too, will be outsourced to machines within the next two decades.
Turing’s Cathedral: The Origins of the Digital Universe
by George Dyson
Pantheon, 401 pp., $29.95
The digital universe came into existence, physically speaking, late in 1950, in Princeton, New Jersey, at the end of Olden Lane. That was when and where the first genuine computer—a high-speed, stored-program, all-purpose digital-reckoning device—stirred into action. It had been wired together, largely out of military surplus components, in a one-story cement-block building that the Institute for Advanced Study had constructed for the purpose. The new machine was dubbed MANIAC, an acronym of “mathematical and numerical integrator and computer.”
The Innovators Behind the Information Age
The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. BY WALTER ISAACSON. Simon & Schuster, 2014, 560 pp. $35.00.
Zero to One: Notes on Startups, or How to Build the Future. BY PETER THIEL WITH BLAKE MASTERS. Crown Business, 2014, 224 pp. $27.00.
In the grand scope of human history, technological progress is actually a surprisingly new phenomenon. While there had always been the occasional new invention or technological breakthrough, it wasn’t until the Industrial Revolution that sustained technological progress became a reality—and, with it, the possibility of steadily rising living standards. For most of the past two centuries, that progress was most visible in the industrial and agricultural realms. But over the past 60 years or so, the lion’s share of innovation has come from a single sector: what is now loosely called “information technology.” When thinking about innovation in the United States today, the first (and sometimes only) place that comes to mind is Silicon Valley. In the simplest sense, Walter Isaacson’s The Innovators explains how that happened and, in the process, sheds some interesting light on what drives innovation more generally.
Supporters of the National Security Agency inevitably defend its sweeping collection of phone and Internet records on the ground that it is only collecting so-called “metadata”—who you call, when you call, how long you talk. Since this does not include the actual content of the communications, the threat to privacy is said to be negligible. That argument is profoundly misleading.
Of course knowing the content of a call can be crucial to establishing a particular threat. But metadata alone can provide an extremely detailed picture of a person’s most intimate associations and interests, and it’s actually much easier as a technological matter to search huge amounts of metadata than to listen to millions of phone calls. As NSA General Counsel Stewart Baker has said, “metadata absolutely tells you everything about somebody’s life. If you have enough metadata, you don’t really need content.” When I quoted Baker at a recent debate at Johns Hopkins University, my opponent, General Michael Hayden, former director of the NSA and the CIA, called Baker’s comment “absolutely correct,” and raised him one, asserting, “We kill people based on metadata.”
The internet promised to feed our minds with knowledge. What have we learned? That our minds need more than that
On my morning bus into town, every teenager and every grown-up sits there staring into their little infinity machine: a pocket-sized window onto more words than any of us could ever read, more music than we could ever listen to, more pictures of people getting naked than we could ever get off to. Until a few years ago, it was unthinkable, this cornucopia of information. Those of us who were already more or less adults when it arrived wonder at how different it must be to be young now. ‘How can any kid be bored when they have Google?’ I remember hearing someone ask.