26/52: Our thin layer of existence.

Hawaii was an absolute treat for the senses, where almost every single day we witnessed something breathtaking: be it the sun setting into the infinite expanse of the orange horizon, the warm saltiness of the sea water on the skin that's just cool enough to alert the peripheries, or the chirping of countless coqui frogs going in and out of synchrony while the jungle leaves rustle peacefully under the backdrop of the star-lit night. Words are not enough to really convey these perceptions, and to be honest, neither are pictures, but we tried anyway. One idea that cannot be conveyed without words, however, is the feeling of minuteness I got - both for me personally and for human lives in general - when I witnessed just a little more of the world around us.

Traveling to new places is always an eye-opening experience, especially when immersed in the ecology and culture of a foreign place. This was different, though - it revealed to me something that I foolishly thought I already knew intimately: the world in which I live. It's like this: imagine you're about to visit someone's house for the first time. It's cool, because you know you'll see things you've never seen before, or at least how the same things might be arranged differently, so it's an expected kind of novelty. Now imagine sitting on your own couch at home, with an abundance of familiarity surrounding you. Everything was either put there by you, or it's been there for so long that it might as well have come with the place. All of a sudden, someone whispers a few words into your ear and you watch the familiarity in front of you unfold into an entirely new experience, realizing for the first time that there is so much more to your home than meets the eye. Hawaii gave me that feeling about this "world", "my" world, and not just through its exquisite wildlife on the surface, but extending from the depths of the ocean to the stars above. The whole two weeks were full of moments like those, but I will just describe a few things that happened over a span of 48 hours on the Big Island (Hawaii Island).

The Earth shaping itself
The state of Hawaii is a chain of islands formed by underwater volcanic activity. I knew this, and it makes sense. How else does a chain of islands emerge in the middle of the ocean? I didn't know, though, that there are active volcanoes on the Big Island. That fact seems routine enough when you read about it, but being there and witnessing it is another thing. We visited Hawaii Volcanoes National Park, where the Halemaumau Crater spews out a thick stream of never-ending smoke during the day, and transforms into a scary demonic pit at night. The park itself is a huge area of land that surrounds the crater, as well as the aftermath of some of the more explosive eruptions from a few decades ago. The landscape is incredibly eerie. It simultaneously makes me appreciate the wrath of Mother Earth, fear her swiftness in taking life away, and marvel at the incredible youth of the land beneath my feet and its newly sprouted inhabitants. When I think of a young Earth, I think of spring and budding greens. But here, youth is charred black, porous, and honestly looks kinda deadly and downright alien.

The youthfulness of the land was further exhibited the next night, when we went to the edge of the park, where lava pours down the slope of the mountain and into the ocean. The feeling of standing there and witnessing NEW EARTH BEING BORN was truly incredible. Land was literally being formed around us, and the rocks we stepped on were younger than any of their visitors (and there were some hardy toddlers braving the lava rock hikes in the pitch-black night). It was a sublime reminder that the world around us is dynamic, constantly morphing, swallowing itself and rebirthing itself - not only do plants and animals cycle through life and death, so too does the Earth they stand on.

Shit I never ever think about.

The world outside of our world
I've always lived in crowded places with extremely dense light pollution. Among those, La Jolla is probably the only place where I can regularly see more than a handful of stars at night. I've heard of friends going out of the city to stargaze, but have never done it myself, nor have I ever really camped in my life (I know). So imagine my awe when we went three quarters of the way up Mauna Kea (9,300 ft), the dormant volcano on the Big Island that peaks at almost 14,000 ft!!! There are a lot of cool little tidbits about this mountain, one of which is that its base is actually deep under the ocean - so deep that, measured from base to peak, it stands at about 33,000 ft, taller than Mount Everest. Driving up to observatory altitude is the embodiment of "0 to 100 real quick". I think we went from beach to 9,300 ft in about an hour? On the way up, we had to drive through a layer of super dense condensation (aka clouds), and there's a local joke that many people hit the invisible cows on the way up and down the mountain, because visibility in the foggy stretch is no more than about 10 m ahead of you. But beyond that, the sky above feels like it reaches the depths of the universe. After nightfall, it's as if we were transported to another dimension or planet outside of our own, because I've never ever seen that many stars shining so brightly. Apparently, from Mauna Kea, one can see every star available in the northern half of the sky, and about 80-90% (?) in the southern sky, because, you know, it's a tall ass mountain.

Standing under that diamond-studded ceiling, we got to see a lot of astronomical phenomena firsthand, through our own eyes (and sometimes through a telescope). For example, there were guides at the visitor information center who set up small telescopes for the crowd to take a closer look at the stars, and I saw, for the first time, Saturn and its ring. It looks like a miniature, cartoon version of the Saturn I'm used to seeing in books and films: a small tilted ring encircling a smaller dot, both unicolor with a gray sheen. It was pretty neat. We also saw the ISS racing through the night sky in a perfect broad curve, and several shooting stars. By far the most indescribable feeling, though, was the smallness of humans and our planet under such a majestic sky. Standing on top of the cold peak, it was like the universe and all its mysteries were suddenly opened to me - I was directly experiencing, for the first time ever, how vast the space out there is and how little we really know.

It wasn't quite a religious moment, but that was as close as I've ever gotten to marveling in the creation of some higher being.

The world within our world
I've posted this before, and I have to post it again. I've watched this video myself about 20 times now, and every time I do, I can't help but have a big stupid grin on my face. There are just so many completely spontaneous opportunities to witness animals enjoying themselves, be it a manta ray tumbling around, a family of sea turtles surfing the current, or a pod of dolphins playing hide and seek with us in the bay.

As a surface-dweller, my idea of life is mostly concentrated around my altitude and on dry land. Rarely do my thoughts venture out into the other 70% of this planet. The waters around Hawaii, though, really made me feel that there is life all around us. Perhaps they're different, and look a little strange, but life nonetheless. Breaking through the thin surface of the water that separates two worlds, you are instantaneously immersed in another storyline, like an invisible fly on the wall with the special privilege to witness the completely normal lives of all its characters. In those moments, I felt acknowledged and welcomed, and I hope I can do the same for them one day. It really makes me question, even now, the extent of other kinds of cognition, beyond our merely human ideals.

After all, we are but a thin layer of existence in a much, much larger whole.

25/52: First research paper published!

Because I'm quite short on my 52 posts this year, I'm stealing this from the lab blog (which I wrote!). But hey! First research paper published, jeez what a long process...

Highlights (tl;dr)

The overarching goal of our recent NeuroImage paper (PDF) is to make inferences about the brain’s synaptic/molecular-level processes using large-scale (very much non-molecular, non-microscopic) electrical recordings. In the following blog post, I will take you through the concept of excitation-inhibition (EI) balance, why it’s important to quantify, and how we go about doing so in the paper, which is the novel contribution. It’s aimed at a broad audience, so there are a lot of analogies and oversimplifications, and I refer you to the paper itself for the gory details. At the end, I reflect a little on the process and talk about the real (untold) story of how this paper came to be.

A Tale of Two Forces

Inside all of our brains, there are two fundamental and opposing forces – no, not good and evil – excitation and inhibition. Excitatory input, well, “excites” a neuron, causing it to depolarize (become more positively charged internally) and fire off an action potential if enough excitatory inputs converge. This is the fundamental mechanism by which neurons communicate: short bursts of electrical impulses. Inhibitory inputs, on the other hand, do exactly the opposite: they hyperpolarize a neuron, making it less likely to fire an action potential. Not to be hyperbolic, but since before you were born, these two forces have been waging war with and balancing one another through embryonic development, infancy, childhood, adulthood, and until death. There are lots of molecular mechanisms for excitation and inhibition, but for the most part, “excitatory neurons” are responsible for sending excitation via a neurotransmitter called glutamate, and “inhibitory neurons” are responsible for inhibition via GABA.


Like all great rivalries (think Batman and Joker, Celtics and Lakers), these two forces cannot exist without each other, but they also keep each other in check: too much excitation leads to runaway activity, such as what happens in a seizure, while too much inhibition shuts everything down, as happens during sleep, anesthesia, or intoxication. This makes intuitive sense, and scientists have empirically validated this “excitation-inhibition balance” concept numerous times. This EI balance, as it’s called, is ubiquitous under normal conditions, and has been proposed to be crucial for neural computation, the routing of information in the brain, and many other processes. Furthermore, it’s been hypothesized, with some experimental evidence in animals, that an imbalance of excitation and inhibition is the cause (or result) of many neurological and psychiatric disorders, including epilepsy, schizophrenia, and autism, just to name a few.

Finding Balance

Given how important this intricate balance is, you might expect that we have good ways to measure it - but it is actually quite difficult to measure, at any given moment, the ratio between excitatory and inhibitory inputs. I mentioned above that there is empirical evidence for balance and imbalance. However, in the vast majority of these cases, measurements are done by poking tiny electrodes into single neurons and, via a protocol called voltage clamping, recording within a single neuron how much excitatory and inhibitory input that neuron is receiving. Because the setup is so delicate, it’s often done in slices of brain tissue kept alive in a dish, or sometimes in a head-fixed, anesthetized mouse or rat – basically, in brain tissue that can’t move much, but not in humans. I mean, imagine doing this in the intact brain of a living human – yeah, I can’t either. And as far as I know, it’s never been done. This presents a pretty big conundrum: we want to link psychiatric disorders to an improper ratio between excitation and inhibition in the human brain directly, but we can’t actually measure that ratio - so how can we corroborate that EI (im)balance matters in the way we think it does?


Our Approach: Parsing Balance From “Background Noise”

This is exactly the problem we try to solve in our recent paper published in NeuroImage: how might one estimate the ratio between excitation and inhibition in a brain region without having to invasively record from within a brain cell (which is not something most people would like to happen to them)?

Well, recording inside a brain cell is hard, but recording outside brain cells – extracellularly – is a LOT easier. It’s still pretty invasive, depending on the technique, but much safer and more feasible in moving, living, behaving people. Of course, recording outside the brain cell is not the same as recording inside it – when we record electrical fluctuations in the space around neurons, rather than from within or right next to a single neuron, we’re picking up the activity of thousands to millions of cells all mixed up together.

The first critical idea of our paper was that this aggregate signal – often referred to as the local field potential (LFP) – reflects excitatory and inhibitory inputs onto a large population of local cells, not just a single one. Therefore, we should be able to get a general estimate of balance by decoding this aggregate signal. The second piece of critical information was the realization that (for the most part) excitatory inputs are fast and inhibitory inputs are slow so that, even when they are mixed together from millions of different sources like in the LFP signal, we are still able to separate their effects: not in time, but in the frequency-domain (see our frequency domain tutorial).
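To build a bit of intuition for that frequency-domain separation, here is a minimal sketch (my illustration, not the model from the paper): the power spectrum of an exponentially decaying current is a Lorentzian whose corner frequency scales with one over the decay time constant, so a slower current concentrates its power at lower frequencies. The 2 ms and 10 ms time constants below are ballpark values assumed purely for illustration.

```python
import numpy as np

fs = 1000.0                       # sampling rate in Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)     # one second of time

# Hypothetical decay time constants: fast "excitation-like" vs slow "inhibition-like"
tau_e, tau_i = 0.002, 0.010       # 2 ms vs 10 ms

kern_e = np.exp(-t / tau_e)       # fast-decaying current
kern_i = np.exp(-t / tau_i)       # slow-decaying current

freqs = np.fft.rfftfreq(len(t), 1 / fs)
psd_e = np.abs(np.fft.rfft(kern_e)) ** 2
psd_i = np.abs(np.fft.rfft(kern_i)) ** 2

# Fraction of each current's total power that sits below 20 Hz
low = freqs < 20
frac_e = psd_e[low].sum() / psd_e.sum()
frac_i = psd_i[low].sum() / psd_i.sum()
```

The slow current parks far more of its power below 20 Hz than the fast one does, which is why a shift in their relative weights shows up in the shape of the spectrum even when the raw time series just looks like noise.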

A: LFP model with excitatory and inhibitory inputs; B: the time course of E and I inputs from a single action potential; C: simulated synaptic inputs (blue and red) and LFP (black); F: LFP index of E:I ratio


Combining Computational Modeling and Empirical (Open) Data

Pursuing this line of reasoning, we simulated populations of neurons in silico and looked at how their activity would generate a local field potential recording. What this means is that we can generate, in a computer simulation, different ratios of excitatory to inhibitory inputs into a brain region and see how that influences the simulated LFP. Through this computational model we found an index for the relative ratio between excitation and inhibition.

For those of you that are into frequency-domain analysis of neural signals, this index is the 1/f power law exponent of the LFP power spectrum. Let’s unpack that a bit. In the figure above (panel B) you can see that the excitatory electrical currents (blue) that contribute to the LFP shoot up in voltage really quickly—within a few thousandths of a second—and then slowly decay back down to zero. In contrast, the inhibitory currents (red) also shoot up pretty quickly—but not as quickly—and then decay back to zero much more slowly than the excitatory inputs. When you add up thousands of these currents happening all at different times, the simulated voltage (panel C, black) looks to us humans a lot like noise. But through the mathematical magic of the Fourier transform, when we look at this same signal’s frequency representation, the two types of current are clearly distinguishable!
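To make the idea concrete, here is a toy end-to-end version - emphatically a sketch of the general approach, not the paper's actual model. Poisson "spike" trains are convolved with a fast (excitation-like) and a slow (inhibition-like) bi-exponential kernel, summed into a fake LFP, and a line is fit to the log-log power spectrum. Every number here (time constants, rates, weights, the 30-70 Hz fit range) is an assumption chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur = 1000.0, 60.0                    # sampling rate (Hz) and duration (s)
n = int(fs * dur)
t_k = np.arange(0, 0.1, 1 / fs)           # 100 ms of kernel support

def syn_kernel(tau_rise, tau_decay):
    """Bi-exponential synaptic current: fast rise, slower decay."""
    k = np.exp(-t_k / tau_decay) - np.exp(-t_k / tau_rise)
    return k / k.max()

kern_e = syn_kernel(1e-4, 2e-3)           # fast, AMPA-like time constants (assumed)
kern_i = syn_kernel(5e-4, 1e-2)           # slow, GABA-A-like time constants (assumed)

def sim_lfp(e_weight):
    """Toy LFP: Poisson input trains convolved with E and I kernels, then summed."""
    drive_e = rng.poisson(5.0, n).astype(float)
    drive_i = rng.poisson(5.0, n).astype(float)
    return (e_weight * np.convolve(drive_e, kern_e, "same")
            + np.convolve(drive_i, kern_i, "same"))

def loglog_slope(x, f_lo=30.0, f_hi=70.0):
    """Least-squares slope of the power spectrum in log-log coordinates."""
    freqs = np.fft.rfftfreq(n, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    sel = (freqs >= f_lo) & (freqs <= f_hi)
    return np.polyfit(np.log10(freqs[sel]), np.log10(psd[sel]), 1)[0]

slope_e_dominated = loglog_slope(sim_lfp(e_weight=6.0))   # excitation-heavy mix
slope_i_dominated = loglog_slope(sim_lfp(e_weight=0.5))   # inhibition-heavy mix
```

Tilting the mix toward the slow inhibitory kernel makes the fitted slope steeper (more negative), which is the qualitative relationship an EI index of this kind exploits.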

More technically, the idea is that the ratio between fast excitation and slow inhibition should be represented by the relative strength between high-frequency (rapidly fluctuating) and low-frequency (slowly fluctuating) signals. With this hypothesis in hand, we were able to make use of several publicly available databases of neural recordings to validate the predictions made by our computational models in a few different ways. One example from the paper: we were lucky enough to find a recording from macaque monkeys undergoing anesthesia, and the anesthetic agent, propofol, acts through amplifying inhibitory inputs in the brain at GABA synapses. Therefore, we predicted that when the monkey goes under, we should see a corresponding change in the power law exponent, and that’s exactly what we found! As you can see below, our EI index remains relatively stable during the awake state, then immediately shoots down toward an inhibition-greater-than-excitation regime during the anesthetized state before coming back to baseline after the anesthesia wears off.


Takeaways and Disclaimers

So to summarize, we were able to make predictions, borne out of observations from previous physiological experiments and our own computational modeling, and then validate these predictions using data from existing databases to draw a link between EI balance, which is a cellular-level process, and the local field potential, which is an aggregate circuit-level signal. Personally, I think that bridging the gap between these different levels of description in the brain is super interesting, and it’s one way for us to confirm our understanding of how the brain gives rise to cognition and behavior at multiple scales. Furthermore, we can now make use of the theoretical idea of EI balance in places where it was previously inaccessible, such as human patients responding to treatment.

Before I wrap up, I just want to point out that this paper does not conclusively show that EI balance directly shifts the power law exponent – what we show is a suggestive correlation. Nor does the correlation hold under all circumstances. We had to make a lot of assumptions, both in our model and about the data we found, such as the noise-like process by which we generated the model inputs. I’m not throwing this out here to inhibit the excitement (hah, hah), but rather to limit the scope of our claim, especially for a public-facing blog piece like this.

Rather, ours is the first step of an ongoing investigation, and although we will probably find evidence that corroborates and contradicts our findings later on, it’s important that anyone reading this and getting excited (hah) about it understands that we do not, and likely will not, have the last word on this. Ultimately, though, I believe we stumbled onto something pretty cool and we’ll definitely follow up on those assumptions one by one, and hopefully have more blog posts to come!

Some Personal Reflection

This project was my first real scientific research project in grad school, and it brought me a lot of joy and excitement, as well as a fair amount of brooding. As a whole, I really enjoyed the process of building a computational model, even if it was quite simple, and using its predictions to inform further empirical investigations. As I mentioned, I think we really need to bridge the gap between molecular-level mechanisms in the brain and circuit/organism-level “neural markers”, and computational modeling allows us to do that in situations where experiments would be intractable for many reasons. I certainly subscribe to the notion that combining theoretical/computational work with empirical data is an exciting and fruitful line of research, because it fills a space between two successful but largely non-overlapping subfields in neuroscience (though that trend is now changing).

Also, the fact that we were able to test our predictions on publicly available data was such a blessing, as we simply did not have the capacity, as a new lab, to do those in vitro and in vivo experiments ourselves. However, that meant combing through tons and tons of data where there might have been unlabeled or badly labeled information, only to reach the conclusion that the data was unusable for our purposes. There was some (a lot of) headbanging due to this, but ultimately, we found useful (FREE!) data, and I’m very grateful to the people that made it available: CRCNS and Buzsaki Lab, Neurotycho and Fujii Lab, as well as many friends and collaborators that donated data for us to test different routes. To support this open-access endeavor, all code used to produce the analysis and figures is on our lab GitHub, found here.

The Untold Story

One last note, for those of you that find the process of scientific discovery interesting: in this blog post, I tried to write the story as a lay-friendly CliffsNotes version of the paper, starting with the importance of EI balance and the motivation to find an accessible index of it in the LFP, then outlining how we went about solving that problem. That’s the scientific story, and while not false, it’s not chronological.

The actual story began with Brad’s 2015 paper showing that aging is associated with 1/f changes. That was actually what first interested me back when I started in 2014 – this seemingly ubiquitous phenomenon (1/f scaling) in neural data. After digging a bit to find various accounts for how 1/f observations arise in nature, we decided to just simulate the LFP ourselves and see what happens. Turns out, the 1/f naturally falls out of the temporal profile of synaptic currents, which both have exponential rise and decay.

Our model contained what I thought to be the bare minimum: excitatory and inhibitory currents. At that point, I didn’t have a clue about what EI balance was or what it had been linked to. I think I was twiddling parameters one day, and realized that changing the relative weight of E and I inputs would cause the 1/f exponent (or slope) to change because of their different time constants. Then, like any modern-day graduate student, I Googled to see if this is something that actually happens in vivo, and the rest is history. This little anecdote really just speaks to the serendipity of science, and it couldn’t have happened without the many hours of spontaneous discussions in the lab, which I’m also very grateful for, and Google. I think these little stories really liven up the otherwise logical world of science, and I’d love to read about such stories from other people!

24/52: Holy sh*t I swam with a manta ray: the most incredible and awe-inspiring thing I've seen.

We saw a lot of wonderful and incredible sights in Hawaii over the last two weeks, all of which I will shortly detail. But the one thing that stood out the most was this moment, so much so that I think it deserves its own entry.

We were snorkeling at Two Steps Beach on the Big Island, near Captain Cook. It was a treasure trove of marine life with an abundance of corals, but by that point we had been in Hawaii for a week and a half, and everything I saw became unremarkable in that everything was routinely remarkable. I was swimming over to the other end of the bay to meet Mei on the beach, still hopeful that I might see a dolphin or two, when this great white shape appeared - about the size of me - and fanned out in front of me.

My immediate feeling was fear, since coming face to face with a wild animal my own size is not a run-of-the-mill experience, on land or in the water, and who knows if this creature was benevolent. Even if it didn't want to eat me, what if I pissed it off by swimming near its home? This feeling of apprehension never quite went away, but it gave way to so many more: excitement at this rare opportunity, curiosity about this strange beast, joy in watching it tumble round and round in the water. So I followed it around for two minutes, just out of reach. Not that it seemed to care.

I hate to anthropomorphize nature, but in this moment, it was impossible not to ponder the mind of this being as it performed its strange dance. Perhaps joyously, perhaps hungrily. All of these thoughts and emotions combined into a single concept - awe - and I found myself completely immersed in the presence of another entity in its undisturbed natural habitat, wondering just how many more lives like this one surround our thin surface of existence, evading our consciousness.

Without further ado (the only edit I made was adding the music, over which you can still sometimes hear my gasps):

23/52: What is the hardest scientific endeavor of all? (Answer: neuroscience)

"The brain is the most complex thing in the universe."

If there is one quote that grinds my gears more than any other, it is this one. It is simply an idiotic thing to say: the brain is not the most complex thing in the universe; the universe, which contains billions and billions of brains small and big, is the most complex thing in the universe. Even the interaction between two brains, certainly, is more complex than a single brain. So why is neuroscience, the scientific study of the brain, the hardest of all?

Let's take, as two examples, quantum physics and astrophysics, fields that study the tiniest and the grandest objects in our universe. These are extremely difficult things to study, because just observing the signals necessary to start answering the questions we've set forth takes real human ingenuity and delicate engineering, to construct devices that can give us reliable measurements we can then use to make inferences. If you are not convinced, watch this really great video on LIGO and the detection of gravitational waves, which, coincidentally, is a combination of both quantum physics and astrophysics (4:45 is the best part... why are they even wearing goggles?).

If that's not the craziest thing I've heard of - an experiment operating at the physical boundaries of the universe (as we understand it today), on both the smallest and the largest scales - I don't know what is. Maybe putting people on Mars? Maybe Elon Musk's new brain hat? Which brings me to how much more difficult neuroscience is. Actually, it's not just neuroscience: it's all scientific efforts that try to study some aspect of the human mind, like psychology and cognitive science (but not, for example, neurobiology). And the reason I believe this is not because neuroscience is intrinsically hard - it most certainly pales in comparison to many branches of the physical and even social sciences.

What makes it hard, I think, is that it is incredibly difficult for a human to be objective when we study the brain and the mind: the phenomena we are interested in explaining are the ones that occur on a daily basis in our mundane lives, like paying attention to traffic, perceiving color, etc., and it is precisely these subjective experiences that we not only draw inspiration from, but also try to dissect as if they were objective things in the universe. I certainly don't want to get into the debate of what is and what is not objectively real, a photon or consciousness, but I think we can agree that one exists more objectively than the other, at least in terms of how we operationalize them scientifically. In fact, I think the more "dead" we think something is, the easier it is to study objectively, which might explain the incredible disparity in our level of understanding of the brain compared to every other organ in our body. When we study the brain and the mind, confirmation bias does not only creep in at the objective scientific level, but at the personal level as well. Yes, we can be diligent in checking results that confirm our hypotheses, but it is damn near impossible to be diligent in checking results that are consistent with our daily experiences and intuition. After all, if I've had these experiences, they must be right, right?

Why is it so hard to be objective here? I'm not sure, but if I were to venture a guess, I think it is rooted in our wish to preserve our own identities, livingness, and humanness. One of the things that makes us humans feel special is the belief that we are special: yes, dogs and cows and rats and dolphins all have brains, but we must be special in some way? And if we were to lose this feeling of specialness in the pursuit of an objective understanding of the brain and our humanness, we would paradoxically lose this humanness altogether. In fact, I believe it is crucial that we treat the human brain like we treat any other organism on this planet in order to properly study it, but that would create such an intense dissonance, because at the end of the day, when we're done being neuroscientists, we're back to being regular people - a friend, a spouse, a parent - all of which require this special humanness to maintain, this special belief that we're all the good guys at the end of the day.

I recently finished reading Paul Kalanithi's When Breath Becomes Air, and his story contextualized our scientific effort to understand the brain in a way that had never explicitly occurred to me before - though I soon realized it's always been an implicit motivation for me, and perhaps for many others - and that is the search for meaning. I believed that we could ultimately, objectively define the meaning of our existence by understanding the brain, the organ that presumably gives us this sense of meaning - our joys, pains, struggles, triumphs, and every other thing we feel - in the first place. Now that I consciously think about it, I don't know whether this will ever happen: to objectively understand this sense of meaning, we may have to give up the idea that there is any meaning in the first place, and simply describe our thoughts, movements, and interactions as physical quantities changing over time, like how we objectively describe ant colonies, economies, and murmurations of birds.



22/52: The Disease of Productivity & Mindful Dishwashing

Since I started graduate school, I think, I've revered industriousness and have been deathly afraid of unproductivity. It's really weird, because I was just as gung-ho about doing well in high school and university, but in some sense I felt like at that time my life (and work) wasn't really mine, and that I was just doing the bare minimum required to get the grades I was supposed to get, but no more - I was on a bus being driven to and fro. Even though EngSci was extremely time-consuming, I still had some "down time" set aside to just hang out with people, get stupid drunk, or go play basketball. Actually, it wasn't "set aside" as much as I was doing anything to not spend time on school. Could I have gotten better grades, been more involved in extracurriculars, or developed SOME kind of interest or hobby outside of school? Probably. But to be honest, the concept of setting aside time to do things as an investment in my future never occurred to me, and I just did whatever I wanted to do.

The moment I started graduate school, though, there was a fundamental change in my mindset. The best way I can describe it is that I began to think of my time doing research/reading/whatever as if it were time working on my own business: I get back as much as I put in, and no more, so if I want to be successful - and there are literally no bounds on what that means - I have to put in as much time as possible. In other words, every second of my time is a resource that I have to spend wisely to get the maximum return on investment, and at this point in my life, that's through working or reading or better equipping myself for research one way or another (narrow-minded, I know). From that moment on, I was always trying to preoccupy myself with work, and I convinced myself that unlimited industriousness is good. Of course it is - there are countless motivational videos on Youtube, and many times more quotes, saying hard work is the secret to life.

After three years, though, I'm starting to think it's become a condition of some sort that I cannot get rid of. I don't know what specifically I was thinking about, but one day I had the thought that perpetual and uncontrolled laziness or procrastination is like a disease, since a person may literally feel as if they have no control over their inability to get started on work. And it dawned on me - I have just as little control over my inability to NOT think about work. Of course, that's not to say that I'm always actually doing work, which is the stupid and ironic part, because I'm also starting to realize that, pushing myself or not, I end up doing about the same amount of work - except in one case, I feel extra bad about taking time off, so I just end up procrastinating with something I don't really want to do, like watching the same video of Gordon Ramsay cooking a steak 5+ times on Youtube. For a while, I tried to consciously set aside time to relax by doing things that aren't work. But that never really worked, because I would just think that I could be spending that time doing work instead.

Now I realize I had it backwards: I was depending on the activity itself to relax me, but the truth is that relaxation comes from the conscious decision to do this thing - anything - instead of thinking about work. This happened a lot with meditation: I noticed that during times when there are no immediate deadlines, meditating is a huge boost to my energy level and general sense of wellbeing. But when I'm in a period of high stress, because a deadline is approaching or I'm just particularly busy with multiple things, meditating not only does not help, but makes the situation worse, because I end up getting distracted by work and then think that I should've just spent that time working instead. It was very frustrating, because I thought meditating was supposed to make me more relaxed, not the opposite?! At some point, it dawned on me that meditation cannot make one's mind more relaxed; it is the relaxed mind itself - one that happily embarks on the small journey of being mindful despite all the chaos engulfing and bombarding it with responsibilities and tasks - that makes meditation relaxing. In other words, more than half the battle is already won when I wholeheartedly commit myself to relaxing. Mind blown, right?

After a few more periods of alternating stress and relative idleness, I started to pick up on all the random little things that are true indicators of my mental wellbeing. Willingness to meditate, of course, is probably a big one, precisely because it's something that is truly unnecessary, in the sense that it does not accomplish anything that "needs to get done". Going to the gym is another such thing. Household chores, in general, are pretty good litmus tests - when I start to neglect them, something is off: dishes piling up in the sink even though I only cook like one meal a day, shirts not hung back up at the end of the day, etc. There is probably an inverse relationship between my daily cortisol level and how many times I've flossed in the last week. Conversely, there are some things that I spend more time doing when I'm not doing well: Youtube, Twitter, and various other forms of social media, reading about random shit on Wikipedia or Buzzfeed, reading the news - though that's largely a non-issue now, since I avoid it like the plague. Anyway, the takeaway here is that how I treat myself in the face of external pressures is the best indicator of my mental wellbeing. Obviously I'm not saying to goof off every time there is more responsibility at work or something. It is simply to say that perhaps we need more loving and caring, from ourselves, precisely when the most is being demanded of us (wow, that sounds super obvious when I write it down that way).