What I've Changed My Mind on Over the 2010s

I’ve been reading a lot of year-end/decade-end reflections (as one does this time of year) — and while a part of me wanted to #humblebrag about how I got a 🏠/💍/👶 this decade 😇 — I thought it would be more interesting & profound to instead call out 10 worldviews & beliefs I had going into the 2010s that I no longer hold.

  1. Sales is an unimportant skill relative to hard work / being smart
    As a stereotypical “good Asian kid” 🤓, I was taught to focus on nailing the task. I still think that focus is important early in one’s life & career, but this decade has made me realize that everyone, whether they know it or not, has to sell — you sell to employers to hire you, academics/nonprofits sell to attract donors and grant funding, even institutional investors have to sell to their investors/limited partners. It’s a skill at least as important as hard work or being smart (if not more so).
  2. Marriage is about finding your soul-mate and living happily ever after
    Having been married for slightly over half the decade, I’ve now come to believe that marriage is less about finding the perfect soul-mate (the “Hollywood version”) and more about finding a life partner who you can actively choose to celebrate (despite and including their flaws, mistakes, and baggage). It’s not that passionate love is unimportant, but it’s hard to rely on that alone to make a lifelong partnership work. I now believe that really boring-sounding things like how you make #adulting decisions and compatibility of communication style matter a lot more than the things usually celebrated in fiction, like the wedding planning, first dates, how nice your vacations together are, whether you can finish each other’s sentences, etc.
  3. Industrial policy doesn’t work
    I tend to be a big skeptic of big government policy — both because of unintended consequences and the risks of politicians picking winners. But a decade of studying (and working with companies who operate in) East Asian economies and watching how subsidies and economies of scale have made Asia the heart of much of advanced manufacturing has forced me to reconsider. It’s not that the negatives don’t happen (there are many examples of China screwing things up with heavy-handed policy), but it’s hard to seriously think about how the world works without recognizing the role that industrial policy has played. For more on how land management and industrial policies impacted economic development in different Asian countries, check out Joe Studwell’s book How Asia Works.
  4. Obesity & weight loss are simple — it’s just calories in & calories out
    From a pure physics perspective, weight gain is a “simple” thermodynamic equation of “calories in minus calories out”. But in working with companies focused on dealing with prediabetes/obesity, I’ve come to appreciate that this “logic” not only ignores the economic and social factors that make obesity a public health problem, it also overlooks the fact that different kinds of foods drive different physiological responses. As an example that just begins to scratch the surface, one very well-controlled study (sadly, a rarity in the field) published in July showed that, even after controlling for exercise and the calories, carbs, fat, fiber, and other nutrients present in a meal, a diet consisting of processed foods resulted in greater weight gain than a diet consisting of unprocessed foods.
  5. Revering luminaries & leaders is a good thing
    It’s very natural to be so compelled by an idea / movement that you find yourself idolizing the people spearheading it. The media feeds into this with popular memoirs & biographies and numerous articles about how you can think/be/act more like [Steve Jobs/Jeff Bezos/Warren Buffett/Barack Obama/etc]. But, over the past decade, I’ve come to feel that this sort of reverence leads to a pernicious laziness of thought. I can admire Steve Jobs for his brilliance in product design, but do I want to copy his approach to management, his use of alternative medicine to treat his cancer, or condone how he treated his illegitimate daughter? I think it’s far better to appreciate an idea and the work of the key people behind it than to equate the piece of work with the person and get sucked into that cult of personality.
  6. Startups are a great place for everyone
    Call it being sucked into the Silicon Valley ethos, but for a long time I believed that startups were a great place for everyone to build a career: a high-speed path to learning & responsibility, the ability to network with other folks, favorable venture funding, one of the only paths to getting stock in rapidly growing companies, and low job-seeking risk (since there’s an expectation that startups often fail or pivot). Several years of working in VC and startups later, while I still agree with my list above, I’ve come to believe that startups are really not a great place for most people. The risk-reward is generally not great for all but the earliest of employees and the most successful of companies, and the “startups are great for learning” Kool-Aid is oftentimes used to justify poor management and work practices. I still think it’s a great place for some (i.e. people who can tolerate more risk [b/c of personal wealth or a spouse with a stable high-paying job], who are knowingly optimizing for learning & responsibility, or who are true believers in a startup’s mission), but I frankly think most people don’t fit the bill.
  7. Microaggressions are just people being overly sensitive
    I’ve been blessed at having only rarely faced overt racism (being told to go back to China 🙄 or that I don’t belong in this country). It’s a product of both where I’ve spent most of my life (in urban areas on the coasts) and my career/socioeconomic status (it’s not great to be racist to a VC you’re trying to raise money from 🤑). But, having spent some dedicated time outside of those coastal areas this past decade and speaking with minorities who’ve lived there, I’ve become exposed to and more aware of “microaggressions”, forms of non-overt prejudice that are generally perpetrated without ill intent: questions like ‘so where are you really from?’ or comments like ‘you speak English really well!’. I once believed people complaining about these were simply being overly sensitive, but I’ve since become an active convert to the idea that, while these are certainly nowhere near as awful as overt hate crimes / racism, they are their own form of systematic prejudice which can, over time, grate on and eat away at your sense of self-worth.
  8. The Western model (liberal democracy, free markets, global institutions) will reign unchallenged as a model for prosperity
    I once believed that the Western model of (relatively) liberal democracy, (relatively) free markets, and US/Europe-led global institutions would reign as the only model of prosperity following the collapse of the Soviet Union. While I probably wouldn’t have gone as far as Fukuyama did in proclaiming “the end of history”, I believed that the world was going to see authoritarian regimes increasingly globalize and embrace Western institutions. What I did not expect was the simultaneous rise of different models of success in countries like China and Saudi Arabia (which, frighteningly, now serve as models for still other countries to embrace), as well as a lasting backlash within the Western countries themselves (i.e. the rise of Trump, Brexit, “anti-globalism”, etc). This has fractured traditional political divides (hence the soul-searching that both major parties are undergoing in the US and the UK) and helped elect illiberal populists in places like Mexico, Brazil, and Europe.
  9. Strategy trumps execution
    As a cerebral guy who spent the first years of his career (in the late 2000s) as a strategy consultant, it shouldn’t be a surprise that much of my focus was on formulating smart business strategy. But having spent much of this decade focused on startups, and having seen large companies like Apple, Amazon, and Netflix brilliantly out-execute companies with better ‘strategic positioning’ (Nokia, Blackberry, Walmart, big media), I’ve come around to a different understanding of how the two balance each other: great execution matters at least as much as smart strategy.
  10. We need to invent radically new solutions to solve the climate crisis
    It’s going to be hard to do this one justice in this limited space — especially since I net out here very differently from Bill Gates — but going into this decade, I never would have expected that the cost of new solar or wind energy facilities could be cheaper than the cost of operating an existing coal plant. I never thought that lithium batteries or LEDs would get as cheap or as good as they are today (with signs that this progress will continue), or that the hottest IPO of the year would be an alternative food technology company (Beyond Meat) which will play a key role in helping us mitigate food/animal-related emissions. Despite the challenges of being a cleantech investor for much of the decade, it’s been a surprising bright spot to see how much pure smart capital and market forces have pushed many of the technologies we need. I still think we will need new policies and a huge amount of political willpower — and I’d also like to see more progress made on long-duration energy storage, carbon capture, and industrial emissions — but whereas I once believed that we’d need radically new energy technologies to thwart the worst of climate change, I am now much more of an optimist here than I was when the decade started.

Here’s to more worldview shifts in the coming decade!


HotChips 101

This post is almost a week overdue thanks to a hectic work week. In any event, I spent last Monday and Tuesday immersed in the high-performance chip world at the 2009 HotChips conference.

Now, full disclosure: I am not an electrical engineer, nor was I even formally trained in computer science. At best, I can “understand” a technical presentation in a manner akin to how my high school biology teacher explained his “understanding” of the Chinese language: “I know enough to get in trouble.”
But despite all of that, I was given a rare look at a world that few non-engineers ever get to see, yet one which has a dramatic impact on the technology sector given the importance of these cutting-edge chip technologies in computers, mobile phones, and consumer electronics.

And here’s my business strategy / non-expert enthusiast view of the six big highlights I took away from the conference, the ones that best inform technology strategy:

  1. We are 5-10 years behind on the software development technology needed to truly get performance out of our new chips. Over the last decade, computer chip companies discovered that simply ramping up clock speeds (the Megahertz/Gigahertz number that everyone talks about when describing how fast a chip is) was not going to cut it as a way of improving computer performance (because of power consumption and heat issues). As a result, instead of making the cores (the processing engines) on a chip faster, chip companies like Intel resorted to adding more cores to each chip. The problem with this approach is that performance becomes highly dependent on software developers being able to create software which can figure out how to separate tasks across multiple cores and share resources effectively between them – something which is “one of the hardest if not the hardest systems challenge that we as an industry have ever faced” (courtesy of UC Berkeley professor Dave Patterson). The result? Chip designers like Intel may innovate to the moon, but unless software techniques catch up, we won’t get to see any of that. (For a toy illustration of what splitting work across cores looks like, see the first sketch after this list.) Is it any wonder, then, that Intel bought multi-core software technology company RapidMind or that other chip designers like IBM and Sun are so heavily committed to creating software products to help developers make use of their chips? (Note: the image to the right is an Apple ad of an Intel bunny suit smoked by the PowerPC chip technology that they used to use)
  2. Computer performance may become more dependent on chip accelerator technologies. The traditional performance “engine” of a computer was the CPU, a product which has made the likes of Intel and IBM fabulously wealthy. But the CPU is a general-purpose “engine” – a jack of all trades, but a master of none. In response to this, companies like NVIDIA, led by HotChips keynote speaker Jen-Hsun Huang, have begun pushing graphics chips (GPUs), traditionally used for gaming or editing movies, as specialized engines for computing power. I’ve discussed this a number of times over at the Bench Press blog, but the basic idea is that instead of using the jack-of-all-trades-and-master-of-none CPU, a system should use specialized chips to address specialized needs. Because a lot of computing power is burnt doing work that is heavy on the mathematical tasks that a GPU is suited to do, or the signal processing work that a digital signal processor might be better at, or the cryptography work that a cryptography accelerator is better suited for, this opens the doorway to the use of other chip technologies in our computers. NVIDIA’s GPU solution is one of the most mature, as they’ve spent a number of years developing a solution they call CUDA, but there was definitely a clear message: as the performance that we care about becomes more and more specialized (like graphics or number crunching or security), special chip accelerators will become more and more important. (The second sketch after this list gives a toy flavor of what offloading math to an accelerator looks like.)
  3. Designing high-speed chips is now less and less about “chip speed” and more and more about memory and input/output. An interesting blog post by Gustavo Duarte highlighted something very fascinating to me: your CPU spends most of its time waiting for things to do. So much time, in fact, that the best way to speed up your chip is not to speed up your processing engine, but to speed up getting tasks into your chip’s processing cores. The biological analogy to this is something called a perfect enzyme – an enzyme that works so fast that its speed is limited by how quickly it can get ahold of things to work on. As a result, every chip presentation spent ~2/3 of the time talking about managing memory (where the chip stores the instructions it will work on) and managing how quickly instructions from the outside (like from your keyboard) get to the chip’s processing cores. In fact, one of the IBM POWER7 presentations spent almost the entire time discussing the POWER7’s use and management of embedded DRAM technology to speed up how quickly tasks can get to the processing cores.
  4. Moore’s Law may no longer be as generous as it used to be. I mentioned before that one of the big “facts of life” in the technology space is the ability of the next product to be cheaper, faster, and better than the last – something I attributed to Moore’s Law (the observation that chip capability, driven by transistor density, doubles every ~2 years). At HotChips, there was a fascinating panel discussing the future of Moore’s Law, mainly asking (a) will Moore’s Law continue to deliver benefits and (b) what happens if it stops? The answers were not very uplifting. While there was a wide range of opinions on how much we’d be able to squeeze out of Moore’s Law going forward, there was broad consensus that the days of just letting Moore’s Law lower your costs, reduce your energy bill, and increase your performance simultaneously were over. The amount of money it costs to design next-generation chips has grown exponentially (one panelist cited a cost of $60 million just to start a new custom project), and the amount of money it costs to operate a semiconductor factory has skyrocketed into the billions. And, as one panelist put it, constantly riding the Moore’s Law technology wave has forced the industry to rely on “tricks” which reduced the benefits that Moore’s Law was typically able to deliver. The panelists warned that future chip innovations were going to be driven more and more by design and software rather than by blindly following Moore’s Law, and that unless new ways to develop chips emerged, the chip industry could find its progress slowing.
  5. Power management is top of mind. The second keynote speaker, EA Chief Creative Officer Richard Hilleman, noted something which gave me significant pause. He said that in 2009, China would probably produce more electric cars in one year than have ever been produced in all of history. The impact on the electronics industry? It will soon be very hard to find and very expensive to buy batteries. This, coupled with the desire of consumers everywhere to have longer battery lives for their computers, phones, and devices, means that managing power consumption is critical for chip designers. In each presentation I watched, I saw the designers roll out a number of power management techniques – the most amusing of which was employed by IBM’s new POWER7 uber-chip. The POWER7 could implement four different low-power modes (so that the system could tune its power consumption), which were humorously named: doze, nap, sleep, and “Rip van Winkle”.
  6. Chip designers can no longer just build “the latest and greatest”. There used to be one playbook in Silicon Valley – build what you did a year ago, but make it faster. That playbook is fast becoming irrelevant. No longer can Silicon Valley just count on people to buy bigger and faster computers to run the latest and greatest applications. Instead, people are choosing to buy cheaper computers to run Facebook and Gmail, which, while interesting and useful, no longer need the CPU or monitor with the greatest “digital horsepower.” EA’s Richard Hilleman noted that this trend was especially important in the gaming industry. Where before the gaming industry focused on hardcore gamers who spent hours and hours building their systems and playing immersive games, today the industry is keen on building games with clever mechanics (e.g. a Guitar Hero or a game for the Nintendo Wii) for people with short attention spans who aren’t willing to spend hours holed up in front of their televisions. Instead of focusing on pure graphical horsepower, gaming companies today want to build games which can be social experiences (like World of Warcraft) or which can be played across many devices (like smartphones or over social networks). With stores like GameStop on the rise, gaming companies can no longer count on just selling games; they need to figure out how to sell “virtual goods” (like upgrades to your character/weapons) or in-game advertising (a Coke billboard in your game?) or encourage users to subscribe. What this all means is that, to stay relevant, technology companies can no longer just gamble on their ability to make yesterday’s product faster – they have to make it better too.
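
To make the first point a bit more concrete, here is a minimal, toy Python sketch of what “splitting work across cores” means in practice: the job only speeds up if it can be carved into independent chunks that a pool of worker processes (roughly one per core) can chew on in parallel. The prime-counting task and the chunk boundaries are purely my own illustrative choices, not anything presented at the conference.

```python
# Toy sketch: split a big prime-counting job into independent chunks and
# hand them to a pool of worker processes (roughly one per CPU core).
# The task and chunk sizes are illustrative, not from the conference.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) the slow, obvious way."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Carve the range 0..100,000 into four independent chunks...
    chunks = [(0, 25_000), (25_000, 50_000), (50_000, 75_000), (75_000, 100_000)]
    # ...and let four worker processes tackle them in parallel.
    with Pool(processes=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

The hard part, of course, is that most real software doesn’t decompose into tidy independent chunks like this, which is exactly the systems challenge the panelists were worried about.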
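
And for the second point, here is a similarly toy sketch of handing a data-parallel piece of math to a GPU accelerator. It uses the numba Python library purely as a convenient stand-in for writing a CUDA kernel (it is not the 2009-era CUDA C toolchain NVIDIA was pitching), and it assumes a CUDA-capable NVIDIA GPU; the vector-add kernel is my own illustrative example.

```python
# Toy sketch of offloading data-parallel math to a GPU "accelerator".
# Uses the numba library as a convenient stand-in for writing a CUDA
# kernel (not the 2009-era CUDA C toolchain the talk referred to);
# assumes a CUDA-capable NVIDIA GPU is present.
import numpy as np
from numba import cuda

@cuda.jit
def add_kernel(x, y, out):
    # Each GPU thread handles exactly one element of the arrays.
    i = cuda.grid(1)
    if i < x.size:
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.ones(n, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_kernel[blocks, threads_per_block](x, y, out)  # launch across thousands of GPU threads

print(out[:5])  # [2. 2. 2. 2. 2.]
```

The appeal is obvious for work that looks like this (the same small operation applied to millions of elements); the catch is that only certain workloads fit that mold, which is why accelerators complement rather than replace the CPU.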

There was a lot more that happened at HotChips than I can describe here (and I skipped over a lot of the more techy details), but those were six of the most interesting messages that I left the conference with, and I am wondering if I can get my firm to pay for another trip next year!

Oh, and just to brag: while at HotChips, I got to see a demo of the potential blockbuster game Batman: Arkham Asylum while checking out NVIDIA’s 3D Vision product! And I have to say, I’m very impressed by both – and am now very tempted by NVIDIA’s “buy a GeForce card, get Batman: Arkham Asylum free” offer.

(Image credit: Intel bunny smoked ad) (Image credit: GPU computing power) (Image Credit: brick wall) (Image – Rip Van Winkle) (Image – World of Warcraft box art)
