If you're new to science, start with the basics. 

Science’s most vital concepts and ideas—some of which have been around for centuries, others of which we’ve only uncovered in recent decades—make the perfect starting point for a deeper dive into the areas of science you’ve long wanted to explore. Whether it’s evolution, electromagnetism, thermodynamics, or the nature of matter, these and other eye-opening concepts are all connected by the profound role they play in our everyday lives and in our larger understanding of the world.

I love setting the record straight. 

Much of the layperson's knowledge of the brain is built on misconceptions. To build a more straightforward, accurate picture of current breakthroughs in neuroscience, you have to shatter popular brain myths. Each of these 24 lectures takes as its focus a single powerful, prevalent myth and uses it as a launch pad from which to explore myriad topics in neuroscience: decision making, memory, dreams, emotions, neuroplasticity, consciousness, mental illness, and much more. I don't just settle for obliterating myths; I replace them with more interesting scientific facts. The result is an eye-opening adventure into why the brain works the way it does. You'll be left with insights that will help you better determine the hard scientific truths behind the breaking news (and myths) of tomorrow. This course has been a non-fiction best seller on Audible.com.


The many micro decisions we make every day have long-term consequences.

When the course development team at Great Courses asked me to do another lecture series for them, this time focusing on the impact of digital technology on the human brain, I had a feeling there wouldn’t be much to tell. After all, digital technology has been around for a mere half century. The human brain has been evolving for 3.5 million years. Compared with the age of our ancient brains, reading and writing are quite new, and they have only been common since the invention of the printing press some 600 years ago. Digital technology, needless to say, is even newer. It couldn’t possibly be changing our brains already, when 600 years of reading and writing hasn’t left a dent.

In short, I hadn’t been worried about the effect that technology was having on our brains in the long term; in fact, I’d always been fairly permissive with screen time when it came to my own kids. Then I started the research. And as I was reading study after study on the subtle ways that technology can nudge our behavior one way or another, I began thinking about my own research on how our brain cells form new memories. It began to dawn on me that the question we must reckon with isn’t whether digital technology is changing our neuroanatomy; it’s how it is shaping our behavior. I was also left wondering whether digital technology has the potential for outsized effects. Is it in fact a superstimulus, drawing all of our attention and pushing us to act in specific ways?

After all, the vast majority of our brain function has evolved to help us optimize our behavior, so that we can survive and reproduce, live among other humans, and craft an environment that benefits our genes. When the landscape changes, our brains adapt very quickly.