15/52: Language: beetles, Bruce Lee, & All the Light We Cannot See

Language is often thought of as the pinnacle of human intelligence. It lets us communicate with each other unreasonably effectively on a day-to-day basis. Well, more effectively than just pointing and grunting. But recently, I've been finding myself becoming more and more dissatisfied with language, in two different ways.

First, there is the ambiguity of language used in cognitive science and neuroscience, especially with terms having to do with mental processes, such as "memory". I raised this annoyance about cognitive science a few weeks ago, and part of the problem is obviously the immaturity of the field as a whole. In other words, we don't have good definitions locked down because we haven't figured out what memory is. But the other problem stems from language itself, I think, or the subjectivity of language. Many expositions and thought experiments have discussed this issue at length, for example, Wittgenstein's beetle-in-a-box thought experiment. In essence, when language is used to discuss something that is purely rooted in the mental realm, i.e., our individual experiences of our own mental processes, we can never be certain that we're talking about the same thing. In most circumstances, the concept represented by a word in the language is rooted in a physical, existing entity, like an apple. We can have a philosophical discussion about whether the apple truly exists, either as a physical entity or as a concept, but I think it's not too controversial to say that an apple is more of an existing physical entity than a cognitive process like "attention". So the question here is: because we each experience "attention" differently, just as we each experience "red", can we ever pin down a definition for it? "Redness" is a purely made-up concept, but it has enough of a physical representation that we can all point to it and simply agree on the fact that it is red. "Attention", however, is something that all of us experience in our daily lives but cannot ever point to in the external world. We can talk about a specific experience in detail and use that as an example of attention, and we can certainly conduct psychophysics studies that objectively measure a specific aspect of human behavior which we then agree to call "attention", but our study of it can never catch up to our ever-evolving experience.
A good example of that is "language" itself. What is language? We have formally studied language for quite a long time, but our daily experiences with it have evolved into quite a different thing. Are emojis language? Are they a part of A language, or are they a language of their own? And by the time the formal study of language accepts Emojism as a proper sub-discipline, new forms will have evolved. Hence, does it ever make sense to study something that is inherently ever-changing? Does that mean that cognitive science, as a whole, is futile? The goal is to understand human cognition, but you can't scientifically study something that you can't define, and you certainly can't define something that is constantly evolving.

My second issue with language is not with language itself, per se, but with my own command of it. There is an interview of Bruce Lee talking about how to express oneself honestly through martial arts that I always come back to, because I think this authenticity is transferable to many other aspects of our lives. Long story short, the difficulty of mastering a skill, whatever it is, lies in aligning your mental and physical self, such that you're able to express, honestly and accurately, what you've envisioned (let's set aside for now my distaste for a mental representation). In martial arts, it might be the envisioning of a specific sequence of movements. In music, it is the same, but with the outcome of sounds that express the desired emotion. In language, it is a lot more literal - to communicate, through words, one's true thoughts and feelings. It's an easy problem at first glance, because words are defined in the dictionary, and we simply have to construct speech from the blueprint of our thoughts to convey the meaning we want. But in practice, it is of course a lot messier than that, because words don't mean the same thing to everyone, and in that sense, the true objective is not using the correct set of words, but conveying the experience we want to convey such that the person listening can understand. The speaker can reasonably expect a certain baseline commonality in the way the two of them understand words, if they speak the same language. But ultimately, how does the speaker choose to convey meaning if their words are not grounded in the same experiences or mental representations as the listener's? Vocabulary is a superficial but, to a degree, sufficient solution to this problem. Learning new words means you can use more precise language to describe very specific thoughts. I don't speak German, but that language seems like a pretty good example of this, where you can find really long words that describe very specific things.
English, on the other hand, is rather poor in this regard. Still, these words depend on common understanding (or shared context), and using obscure words might actually hinder understanding.

I came to realize my own poor grasp of language through a simultaneous process of reading and speaking. I recently finished the novel "All the Light We Cannot See", and it is, in my limited literary experience, a truly wonderful piece of writing. The first half of the story is rather mundane, but the author writes it with such sublime detail that, for the first time, reading for the sake of reading, and not for the plot line, gave me joy. Several passages live vividly in my mind; I will describe just one. Our young protagonist is a blind French girl who lives with her father in Paris at the start of World War II. She learned Braille just a few years ago and is furiously devouring literary classics through her fingertips. On this particular day, however, the German invasion of Paris has begun and her father has been summoned to attend to museum business, and she anxiously awaits his return:

"Marie-Laure sits cross-legged on the floor of the key pound and tries to read her novel. Captain Nemo is about to take Professor Aronnax and his companions on an underwater stroll through oyster beds to hunt for pearls, but Aronnax is afraid of the prospect of sharks, and though she longs to know what will happen, the sentences disintegrate across the page. Words devolve into letters, letters into unintelligible bumps. She feels as if big mitts have been drawn over each hand."

You know that feeling you get when you cannot focus on reading something and you keep going back to the sentence over and over again, each time as if the words disappear into a puff of smoke? I felt like that passage was the first time I had ever seen that feeling adequately described in words, and it comes from a description of a girl who reads with her fingers. And, from that, I felt like I was able to immediately relate to how she felt, even though the ways we read could not be more different. Contrary to what I said above (my first issue), something was able to link these two completely orthogonal and disparate experiences. Of course, I will never know what she was truly feeling, or what the author truly intended to convey, but through this passage, I think I was able to experience for the first time the exquisite power of a literary device. Having experienced how precisely one can describe/evoke an experience with simple, common words, I realized how poorly I take advantage of this in my everyday speech and writing. Part of the problem is that I am really quite literal, so I often feel bound by the "right" definition of words and phrases, to the point that semantic accuracy supersedes imagination, which is unsurprising given my affinity for math and programming. A great example of this is when I struggle to use phrases whose social context I don't understand, like "I'm sorry". To me, "I'm sorry" means I regret and apologize for the mistake or inconvenience that I personally have caused, and only that. But over a very long period of time, I've learned that you're also supposed to say "I'm sorry" to someone when something unfortunate has happened to them, like the death or illness of a friend or family member. This perplexed me to no end, because I didn't understand how you can be sorry for something that has nothing to do with you.
On Merriam-Webster, it actually says it's an expression of sorrow or regret, so it includes the feeling of sadness I feel when something unfortunate happens to someone I care about. But which is it? Sorrow and regret are completely different things, unless I also misunderstand "regret". The point is, my new-found awareness of my inadequate command of language feels as if I were trying to practice martial arts underwater - the body does not go where the mind wishes.

14/52: The life of a TA

This week marks my 6th time being a teaching assistant. Boy, it feels like it was just yesterday that I myself was an undergrad sitting in a tutorial/discussion section, all green-horned and not having a single clue that the TA up there was probably a sleep-deprived graduate student who also probably didn't have a say in the particular class they were teaching. Had I known what I know now, having been in the shoes of a TA, I would've given myself some advice that definitely could have made life easier and learning more fun. So I will share it here, in case anybody reading is still IN undergrad, or knows somebody who is.

tl;dr: The advice, essentially, comes down to two things: 1) TAs (almost always) want to help, but please, help me help you, and 2) don't be an asshole, TAs are humans too. I know, groundbreaking, right? Some might even say that professors are also humans, but that's more of a case-by-case thing.

Also, I was flipping through my TA reviews to prepare for this upcoming quarter, so I thought I might share some of the hilarious tips for improvement (but padded with the good reviews too, in case anyone reading this might give me a job later). Click on the image to see the full comments. (Cover photo credit to Jorge Cham of PHD Comics.)

Praxis 1 - Intro to Engineering Design
This was my very first time TAing, and it was when I myself was still in the final year of my undergrad. For those of you who aren't in on the EngSci scoop, Praxis 1 is one of the most controversial and polarizing courses of the entire Engineering Science curriculum. I still maintain that it is one of the most useful classes I've taken, because it emphasized presentation and communication of ideas as an essential part of engineering, not just applied math and sciences. But some people just hated it, because they got into engineering thinking it would be fair and objective, and essentially had to take this essay-writing BS class in the very first semester. Clearly I'm biased because I liked it enough to TA for it, but it was also good money as an undergrad barely making it on my own. Anyway, getting the insider scoop on how Praxis was run gave me, for the first time, clarity on life as a post-secondary educator: no matter who it is and what profession they are in, the majority of people are JUST barely keeping their shit together (myself included), and that's not a knock on anybody. You think students hand in assignments last minute? Well, TAs and profs also make up and grade assignments last minute. In hindsight, this was probably one of the most organized classes, because there was an especially strong focus on pedagogy, and it was the main priority for most of the people involved, which cannot be said for most research professors. In any case, this class had such a profound effect on me that it's partially the reason I still have a blog today. All students were required to have a blog in first year, which led me to continue writing online throughout undergrad, and even as a TA. This class was really unique because design studios were not the same as lectures, but honestly - and this applies to all classes - all you gotta do is act like this shit is kinda interesting, and your TA will go out of their way to HELP YOU succeed.
I don't have any funny TA reviews to share for this class, because I never got them, and to this day I still think to myself that I might have been fired because I was doing such a poor job, and Jason Foster spared me the misery and instead told me a grad student had to be hired due to "union reasons".

COGS1 - Intro to Cognitive Science
This was my first TA assignment in graduate school, second quarter of first year. It was also just 4 months after we first asked ourselves, in our own graduate class, "what is cognitive science?" To this day, I still don't know (refer to last week's post), but I obviously couldn't let the poor students know that. The problem was further compounded by my belief at the time that neuroscience was the only real science here, but again, I couldn't let the students know that. I usually tell the classes I TA that I am here to learn from you as much as you are here to learn from me, and it could not be more true here. Aside from the material, this class was basically your vanilla first-year class, the CogSci equivalent of first-year biology: large lectures with multiple-choice quizzes and exams. The road to success on a multiple-choice exam is simple: go to the review session, preferably with questions. If you don't have questions, go with a friend who does. Think about it: why the hell would your TA volunteer an extra hour of their life to hold an exam review session just to be more unhelpful than usual? Does this make any sense? No. I don't know about other departments, but I and almost everybody I know basically run review sessions by giving out a subset of the exam questions that will be covered for sure, especially the confusing ones, which means if you pay attention throughout the whole session, even if you forget a few things, you're guaranteed like an 80%. The only exception is when the class is known to weed out students, in which case the prof might aim for a 60% average. Even then, knowing what your TA knows will help you stay on the right side of the bell curve.

"Nice and chill"

"Calm" was most definitely not how I felt.

I would give definite responses if there were such a thing in cogsci

"Get everything together." Only if getting my life together is that simple. Also, "when no one has any questions assume that they don't know and review the question anyway." If you are this kind of person: fuck you. If you have hands, you can raise them, god gave you motor function precisely so you don't have to hold other people accountable for your idiotic actions, and that's AFTER being probed many, many times about questions. Another key to success: if you have a question, ASK IT. Also, as you can clearly see, I have no knowledge of anything outside my little corner of the CogSci hexagon at that time. I might argue I still don't, but I can pretend better now. Which brings me to...

COGS1 - Intro to Cognitive Science (again)
Two years later, I got assigned to TA the same class, except this time, enrollment was in the 400s. This huge class seriously made me (once again) doubt the effectiveness of large post-secondary institutions, because as well-intentioned and caring as the TAs and instructors try to be, managing 400 people is simply not the same as managing 100 people 4 times. Or maybe it is, but it just feels exponentially more difficult. By the end of the 10 weeks, I had learned something about prepping for big classes like these: don't do anything at the beginning that will make you hate yourself at the end. In a small class, it's fun to be more flexible because you can go with the flow of the class, see what they engage with and what they don't, and any additional work spawned from spontaneity isn't all that bad. With 400 people, it's a different story. Even the most airtight syllabus will inevitably prompt questions from students, especially pertaining to grades, so my recommendation for planning large classes is that all grades, down to the percent, be set in actionable terms on the syllabus from day 1, because it is not a fun weekend getting 10+ emails from students asking how they can earn those ambiguously-defined 2% participation marks.

Conversely, if you're a student in a large class like this, I tend to put you in one of three categories: "under-the-radar", "grade-hound", or "seems-interested". Under the radar means I have no clue what your name is, but can probably recognize your face. Not the best, but if you know your material, it's fine; we don't ever have to cross paths. A grade hound is the person who will not engage in any other aspect of the class with the TA or instructor EXCEPT to show up with a graded assignment to ask "why did I not get marks for ______?" It's not that showing up to ask for clarification is so offensive, but if your TA senses that you have no interest in understanding why your answer was wrong in the first place, and just want to argue, then they're not going to want to help you. On the other hand, if you seem interested, e.g. ask a question in section once in a while, then when the time comes that you ask me why you didn't get marks for whatever, I will assume that you want a better grasp of the material, and furthermore, be more willing to see from your perspective why you thought the way you did. This seems very subjective and unfair, but this is a fact of life: life is not just black and white, relationships with people matter, and NOBODY can be objective grading an essay response. Obviously I'm not looking for students to hand free marks to, but if something falls in the grey area, I WILL be more inclined to assume what you wrote is shorthand for the actual answer, especially if we've talked about it before in person. Simple. I want to help every single person get good grades in the classes I teach, but please, help me help you.

"Standing with his mouth hanging open"

Alright, I took shortcuts on slides this quarter, it's true, I admit it. But I will not give up my policy of letting you teach yourselves, I just won't. I don't get paid enough for this shit, and there are 50 of you in section; somebody has to know the answer, right? But no, seriously, I always try to get students to answer each other's questions, and to disagree with each other, because that way more people might remember it. Perhaps I need to refine my waiting technique, maybe do a little dance or play the Jeopardy music while I wait...

COGS14B - Intro to Stats
Not much to say here that I haven't already covered above: go to office hours, go to exam reviews; those help. Intro-level math classes are difficult to teach because they're usually a requirement, so students are there against their will, which means there is a large distribution of skill and interest levels. Some people will fall asleep because they did this shit in high school, and some people will fall asleep because they're so far behind it doesn't even matter anymore. As a TA, the material itself is easy, but teaching it in an intuitive way is not. Honestly, teaching this class helped me understand simple stats concepts so much better than when I took the class myself in undergrad. I have more photos of people sleeping in that class than pages of my own notes, probably.

mildly entertaining is what i strive for in life

I'm glad SOMEBODY appreciates my blank stare and standing with my mouth open.


Like I said, you can't please everybody in an intro math class.

COGS118B - Machine Learning
For whatever reason, I got pegged to TA machine learning, which, even though I have a background in it and took ML classes in undergrad, is so far out of my area of expertise that some of the students in this class could probably teach me (one actually did school me in linear algebra, on one occasion). These two comments basically sum up my experience:

That's all, folks. If you're reading this, you're probably a friend of mine, which means you probably won't ever get to use this advice for undergrad again, but maybe your younger sibling/cousin/kids might. The point is, most TAs I know try to be helpful, so if you need help, go to them, be it during office hours or exam reviews. No need to bring chocolate or be a teacher's pet or anything like that; just be respectful in the way you'd be to another human being, and be interested in doing well. In my own experience as an undergrad, I was usually just so behind on coursework that there was no point going to office hours because I'd have had nothing to say. I still remember this one time, during like the 4th tutorial section of the ML class I took in undergrad, the TA was talking about some complicated shit like SVMs or something, and one guy strolls in, sits down, raises his hand, and says "can you explain linear regression again?" You don't want to be THAT guy, either.

13/52: Do we need a (better) functional description of the brain?

Every two weeks or so, I am faced with the innocent but extremely daunting question of "so, what do you study?" This almost always happens in an Uber/Lyft ride, and I now have a routine that I go through before carrying forward with this conversation. The routine consists of thinking for half a second whether I really want to tell this person that I study Cognitive Science, and then inevitably deciding to tell them that I study neuroscience. 9 out of 10 times, bless their hearts, they will respond with: "so you're going to be a neurosurgeon?" And I will proceed to tell them, no, you have to be a real doctor to do that.



But why do I tell my Uber drivers that I study neuroscience? Well, in part, because it's true. My day-to-day activities consist almost solely of trying to understand the structure and dynamics of the biological brain. But on the other hand, I tell them that because if they were to ask what cognitive science is, I would have no clue what to tell them. This is not because I'm a terrible student. Consider this: one of the most-asked questions within cognitive science, in my 3 years of experience in graduate school, is "what is cognitive science?" We were asked this question on our first day of class, and we still ask ourselves this question on a weekly basis. Still not convinced? The image above is from the website of a course a professor is putting together for the new quarter, which I literally just got an email about. I don't know if other fields of research are similarly introspective, but I also don't know any other field that is as interdisciplinary as cognitive science. Whether cognitive science, as it exists today, is interdisciplinary or cross-disciplinary is up for debate, but the fact is that you often have a room of professional academics who research wildly different things. This is particularly true in the CogSci department at UCSD, but a look at last year's proceedings from the Annual Meeting of the Cognitive Science Society - an international gathering - reveals similar sentiments: lacking a better word to describe it, the collection of these papers seems pretty random. This rich multidisciplinarity, though, is not the cause of the lack of identity, but another consequence of it.

A mind split
Digging a bit deeper, you will recover a theme in that collection. Cognitive science, to be sure, had its glory days. About 30 years ago, CogSci gave birth to some truly avant-garde work in psychology, neuroscience, and, as many will know, artificial intelligence, specifically in the form of artificial neural networks. In a way, CogSci was just as blended then as it is now, but for whatever reason, in the 30 years between the creation of the department at UCSD and today, it has lost any semblance of a coherent identity. The reason for that, I believe, is the discretization of the different aspects of the "mind", or cognition. Ultimately, if you ask a cognitive scientist to give a single and definitive answer as to what cognitive scientists study, it would probably center around the study of the mind, within their specialized context, like learning, social interaction, or spatial navigation. Some might say it's the study of "intelligent systems", but pushed hard enough, intelligence is ultimately measured in a mind-like way. You probably wouldn't get this answer unless you put a gun to their heads, though, because "mind" is a fluid concept that is intuitive but very hard to define scientifically.

Anyway, 30 years ago, a hodgepodge of smart people getting together to study the mind worked, because the specific details hadn't been mapped out yet. Our understanding of the brain, as well as other intelligent systems, was poor, and creating a computer program that mimicked some aspect of human behavior meant you were getting somewhere in figuring out the mind. It was the corollary to the idiom "what I cannot create, I do not understand": what we can create together, we might understand. Fast-forward 30 years, and I think our understanding of the different - physical - components of the "mind" is too detailed for abstract conversations across different perspectives on the mind. At the same time, it is not yet detailed enough, largely speaking, to support an actual rigorous scientific discussion linking one field to another. Neurons in the brain are not the same as neurons in a deep network; they just aren't. This exposes a fundamental problem in the way we have been looking at cognition: "cognition", in its essence, is a functional metaphor that we have been using interchangeably to describe different natural phenomena, and it is losing steam in the face of better and more faithful metaphors - scientific models - of these different systems.

Cognition as a metaphor
One of the most fascinating things I've learned in my 3 years here is how completely different physical systems can exhibit similar intelligent behavior. For example, slime mold is a kind of amoeba-like organism that, as a group, can solve some very difficult geometric problems. On a bigger scale, cells in our own bodies organize together to form very intricate systems, like the brain, that enable us humans to become pretty smart. And at the biggest scale, societies are made up of millions of individual people that also, somehow, give rise to behaviors way more complex than the linear combination of their constituents - phenomena like cultural norms and macroeconomics. In a particular department of cognitive science, you might find professors who study each of those things, and rightfully so, because they all study some mind-like, intelligent phenomenon that ultimately seems orderly but is difficult to characterize. Even more incredibly, the mathematics and other tools they each use can be identical - tools like dynamical systems, graph and network theory, etc. However - and I'm sure I will look back on this in 3 years or so and feel stupid - I think there are a few pressing issues we need to address before acting as if everybody has a common goal.

Currently, my personal opinion is that cognition is a functional metaphor that we, humans, have imposed on natural systems all around us, kind of like an anthropomorphization of purely physical things. At its core, it's rooted in our own consciousness and belief that we have an intelligent mind, and our current understanding of the human brain fuels the belief that the brain functionally gives rise to the mind - which is a functional account of the brain. Of course, that's not strictly false, because taking away the brain will obviously also take away our ability to perform certain behaviors, like seeing, reaching, and scrolling through Facebook. But at the same time, it's a metaphor in the sense that the physical behavior we are able to produce is a natural, physical consequence of something the brain is doing, not a functional consequence. I find this concept most easily distinguished via perception and action, and most confused when thinking about "thinking". For example, muscles in your arm are innervated by neurons in your spinal cord, which are in turn connected to motor neurons that reach down from your brain. It is therefore not very difficult to think of the whole chain as a purely physical system: neurons in the brain fire, arm moves. Not that the problem of how your brain makes that happen in such a precise way isn't extremely difficult; the point is that we can think about this whole sequence of events just by tracing the ions that pass through layers and layers of cell membranes, bypassing any need for "cognition". The problem arises, though, when we want to talk about WHY you decided to move your arm in such a way. There, we are stuck with an extra layer of metaphor: the brain gave rise to the computation required to initiate this volitional movement. And if we're not careful, we would conclude that that's the brain's function! This seems both extremely intuitive and extremely absurd.
After all, we would never say the function of electrons is to pass charges from the battery to your phone screen! No, we simply describe electrons for what they are in the entangled physical system that is your phone. 

What is the right metaphor, then?
To be sure, an "electron" is also a metaphor. It's a scientific model of a phenomenon, one that is able to consistently describe, and more importantly, predict things that have happened in the past and things that will happen in the future. These metaphors are updated constantly as we discover new facts about the world through scientific experiments. In that sense, neuroscience (cognitive neuroscience aside) is an attempt to describe the brain with these kinds of purely physical metaphors: ions, membranes, circuits. We can, and I think we should, attempt to describe the nervous system as completely as possible at this purely physical level, through metaphors that describe its structure and dynamics. This doesn't mean we shouldn't care about behavior. In fact, we can easily combine this physical system with those for which we already have working, descriptive metaphors. Really, we've been doing this for organs in our body for a long time, by studying their anatomy and dynamics, like the lungs and the heart. But for whatever reason, we feel that there is more to the brain. Again, personal opinion, but I would say this is probably the disastrous result of the rushed marriage between psychology and neurology: we've always wanted to understand ourselves more, from a philosophical perspective, through the study of psychology, and we accidentally stumbled upon the fact that people missing parts of their brain also have psychological abnormalities. So why not put two and two together, never mind the possibly uncrossable chasm between these two metaphors?

So what good is the metaphor of the mind, scientifically? For some, it is a useful abstraction that generalizes over a set of similar phenomena. For example, "memory" is a shorthand for whenever we are able to recall a previous experience, be it what color shirt you wore on Tuesday, how your lunch smelled, or how ecstatic you felt after passing your exam. Judging by the current state of things, I don't know whether these abstractions are more useful or more hurtful for scientific progress, at least in neuroscience. Because of the vague non-descriptiveness of "memory" as a metaphor, we've had to come up with more discrete versions: visual memory, auditory memory, affective memory, working memory, short-term memory, long-term memory, episodic memory, procedural memory, and the list goes on. But then why not study the brain as a purely physical system coupled to the rest of your body? After all, it would have tremendous practical utility in clinical applications, like treating those with neurological and psychiatric disorders. My speculation is that using words like these allows us to tie the science of cognition back to our daily experiences, which is ultimately what we're interested in. Some people are truly fascinated by how a specific ion channel contributes to the dynamics of a nerve cell, to which I say: "good for you, we need more people like you." But I would say most people interested in cognition - scientists and the general public alike - are in it for the philosophical reason: that perhaps by characterizing it scientifically, we would give our experiences regularity and meaning, rhyme and reason for their existence. In fact, I would say that the current infatuation that society at large has with neuroscience is really a misplaced fascination with psychology.
Not that there's anything wrong with that; people care about things that are relevant to them, and neuroscience can contribute to that in two ways: understanding brain diseases, and understanding our daily mental experiences. Except now I'm not so convinced that it can ever do the latter.


12/52: GET OUT

March 26, 2017

Obviously, spoiler alert. If you haven't seen Get Out and are intending to see it, stop reading. If you have (or don't plan on doing so), do continue and let me know if you took away any of your own metaphors.

tldr: watched a horror movie; shattered my sense of identity; can't imagine the shit black people have to deal with; I will start wearing a metaphorical tinfoil hat from now on.


I don't usually watch horror films, and when I do, I almost never get creeped out the way they are intended to creep you out. Of course, something suddenly popping out of the screen gets a nice squirt of adrenaline into my veins, but afterwards, I never think twice about the movie. I think this is because I'm a very literal person, and for a horror film to take hold of you after you go home and make you run up the basement stairs after the lights go off, you have to buy into the metaphor. Usually, the metaphors themselves are very literal: a supernatural being is hiding in the dark; a serial murderer is looking for you; fate has marked your imminent and violent death. I was baptized in the church of physics, so ghosts and grungy girls groping me through my television are more laughable than gruesome; and to not die at the hands of a serial murderer, just follow your common sense, run the hell out of there, and don't trip. In short, either I haven't watched a lot of quality horror films (I haven't), or they're just not my thing - which is why this movie gave me extra creeps.

To be honest, throughout the film, I thought it was just okay: there were good comedic moments, and the horror/racial tropes were cringeworthy enough to be enjoyable. But as a whole, my literalness fixated on the horror being conducted on screen, namely, white people displacing the souls of black people and taking over their bodies as their own. Neither the hypnosis nor the neurosurgery made a drop of sense, and so the whole plot was reduced to a complicated and farcical story to string together an array of racial jokes, as a way to flip off well-intending and fully racist white folks alike. Don't get me wrong, none of that is a bad thing. I can fully get behind ironic and comedic exposure to the ways minorities are mistreated, but I suppose I was unsatisfied because the film wasn't any more than watching a bunch of Key and Peele sketches in one sitting, which, again, while enjoyable, maybe doesn't deserve its outstanding Rotten Tomatoes ratings. As Chris was saved by his goofy friend, my relief for our protagonist was mixed with confusion and disappointment. Had I just not gotten why everyone loved it so much? Maybe I was aware enough of the ways black people are (mis)treated to be mildly entertained as opposed to enlightened? But as the last 10 seconds of the film unfolded, as said goofy friend backed the car out, the screen fixated on Chris' face: it wasn't a face of relief, knowing that he had been saved from the final horror. It was a face of horror itself, of confusion and uncertainty. And somewhere in that last 10 seconds, it suddenly hit me like a punch in the face, time stopped and I was suddenly hyper-aware of myself and my surroundings as the lights cut on. I think I got it. Or my version of "it", whatever it is.

My family moved from China to our current home of Toronto, Canada in 2002, when I was a young Chinese boy 11 years of age. Somehow, we stumbled into an upper-middle class neighborhood around Sheppard and Bayview. (If you don't know what that's like: we had an upscale mall with no McDonalds but an A&W.) Of course, we had close to no money (or so I believe, since they never bought me those goddamn YuGiOh cards), and lived with another Chinese family in an apartment suite meant for one small family. The adults took up the two bedrooms, each of which, in my memory, was really only big enough for a bed, which I had to sit on to use the desk because there was no room for an extra chair. The two boys - me and the other family's son, a year older than me - bunked in the living room. Not as a den-turned-bedroom thing that people living in downtown condos do now, but just two beds in the living room next to the TV. We had become familiar with two other Chinese families in the apartment complex, and they, along with our roommate family, were my parents' closest acquaintances for those couple of years. I write all of this not as a sob story demonstrating how poor and unprivileged we were, but to say that I had absolutely no clue any of this was abnormal in any way. Later on in life I would learn that this is actually a common experience for Asian technology-immigrant (as opposed to investment) families, but that was not what made the situation normal for me. For me, it was normal because I was at an age trying to normalize absolutely everything I could around me.

There are, however, more foreign things that I've normalized and normalized myself into that I never fully consciously chose. The natural consequence of living in an upper-middle class neighborhood in Canada was that I lived around and went to school with predominantly white people. Not hipsters or hippies, and as far as I can tell, not Canadian rednecks. Just regular, normal white families in faux-suburbia. I still don't fully understand the struggle to fit into an entirely different world as a budding teenager or an immigrant, but I imagine it was pretty confusing, and I was trying to soak up as much white culture as I could. For whatever reason, I realized that to be accepted in this new social hierarchy, I had to disassociate myself from friends who were "too Asian" and not act like that myself. There were, of course, Asian people in the community, but only the westernized Asians had any luck in being integrated into the vicious world of middle school preteens. And so, I tried to make white friends, did white people things, liked white girls, and tried to dress like white boys (who, ironically, at the time, tried to dress like black rappers - 3XL single-colored t-shirts and pants at the waist.) My parents probably thought I was messed up, but I think they were just happy that I got on okay with school. Thinking back, I've made some life-long Asian friends who were "real Asians" deep down but were just as good at pretending that they were not. In any case, I had always chalked that weird period up to being a child desperately adjusting to a situation that would've been formative even for a native child, and while that WAS the reason, I never really considered the consequence: that it served as the pervasive foundation of my newfound identity. Thank god Tupac and Eminem intervened, but that's a story for another day.

There is an implicit and unwavering certainty in my dedication to studying the brain from a biological and physiological perspective, which is that I believe changing any aspect of ourselves must first come from, and is a result of, a change in the physical substrate of our self, the brain. That certainty, while almost certainly scientifically correct, made me blind to the fact that changes to the brain and our fundamental identity can occur without intentionally altering the brain via mechanical (surgery), electrical, or chemical (drugs) interventions. The magical thing about how we - all organisms with nervous systems - have evolved is that we are able to receive sensory stimuli through our eyes and ears and change how our brain is wired without any conscious intervention. I don't know how, but this was the realization that came to me in those last 10 seconds of the film, and I suddenly understood our protagonist's face of horror and uncertainty as a reflection of the terror of not knowing whether, after having been hypnotized repeatedly, he would ever be the same person again. The metaphor of the brain surgery and hypnosis-induced burial of self-identity revealed itself for what it really stood for: unconscious integration of an external culture into our own identity. Having the shot of adrenaline definitely amplified this paranoia, but in that moment, I sat there in the theatre, a single Asian person surrounded by matrices of white movie-goers on a Saturday night, wondering to myself: who the fuck am I? And how do I make everybody else get the fuck out of my head?

For those last 10 seconds of the film, and the credits afterwards, my face probably mimicked Chris'. To put it less dramatically, I was scanning through every single one of my preferences and aversions - food, fashion, music, values - for the subconscious infiltration of Western (white) culture. To put it more dramatically, my sense of identity was shattered. After the adrenaline washed away, I was less paranoid and I could entertain the thought of watching a TV show or reading the news again, but the question remained: which parts of my "self" had I unknowingly accepted as my own, subconsciously brainwashed into buying into? And, more importantly, which at the cost of my previous - original - cultural identity? That was my interpretation of the metaphor the film ultimately tries to impress upon us, in addition to, and almost orthogonal to, the literal racist treatments that black people receive from suburban white people. What sort of ideologies and qualities had we idolized without consciously choosing, because the dominant social structure dictates so? I use "we" very loosely here, because, obviously, I'm not black. In fact, I won't really talk about the message as it is intended for African Americans, not because I don't sympathize or think it's important, but because I don't have the authority or experience to speak on it. I can, however, talk about my experience as an Asian immigrant. And while mine is in no way as complex or as dark a history (or current social and political climate), I think my experiences were similar in that they, too, were different from the predominant, acceptable way of living life in North America, if not in the difference itself.

So who am I? Well, I'm not sure. Taken to the extreme, we are of course (mostly) a sum of our experiences. And while being situated in a Western culture can immerse us in one set of subconscious routines, our parents, friends, and people we meet also impart their own to us, sometimes inherited from THEIR own culture and power structures. The difference is, though, that your parents have your best interests in mind when teaching you values or how to represent yourself, and there is usually a rich cultural history behind those customs. The other? Maybe not so much. At the same time, I'm not saying any single white person is at fault here: not my neighbors and classmates who acted and dressed a certain way because it was their culture, and not even those who made fun of mine because it was foreign and different, because that's what people do. Nonetheless, it shocked me to realize that I thought a lot of things were "normal" not because they were, but because of the environment I was in. I'll give you an example: packing sandwiches for lunch. My parents were of course perplexed by my request for sandwiches as my lunch items, instead of leftover rice and vegetables. And I requested them because that's what "normal" kids did, and if I wanted to be cool, I had to be normal first, so sandwiches please. I don't even like fucking sandwiches, but to this day, I still think that's the normal item of choice for lunch. That is, of course, a harmless example, but there are examples that are not so innocent, like subconscious values or the way we try to shape our mental and physical selves. Again, I don't want to do a disservice to the way this film aims to represent the black struggle, but a simple Googling should satisfy any further curiosity.

I will end with this, and I will try to not sound like a paranoid conspiracy theorist: if anyone is concretely at fault here, it is the media. Again, I'm trying to not sound crazy here, but in my opinion, that is the reason why the takeover process in the film is completed via sitting in front of the television, and why Chris was watching TV when his mother died - it's a metaphor for forgoing his own (root) identity because he was locked into something that told him what he should be instead. I've understood for some time now that how the media portrays something influences the way we think about that thing. But last night was really the first time that I had the jarring realization that the way the media portrays something influences the way we ARE, through influencing how we think about ourselves in relation to what is portrayed as "normal". This, with full respect to the ways it could be and IS used to propagate dominance over racial and cultural groups, goes beyond just that. This is not just a black vs. white people thing; this is those who have control over content vs. those who don't.

Incredibly, the theme of this movie converged on what we've been experiencing for the past little while now, with reference to our current political "situation": that hyper-connectivity benefits those who control the communication channels the most. I'm an independent content creator and I have a blog, which means that probably three times as many people will read it as would have had I not been able to share it on Facebook and Twitter. A step up: if I were a consumer product conglomerate, I could buy an ad through Facebook or Google Ads that lets 1,000 times as many people see it. But if I were really, really rich, I could, theoretically, tailor your online experience exactly the way I want to. Post-election night, we focused a lot on "echo chambers" within our own social networks, and that is obviously one component of the bigger problem. But we can't ignore the fact that somebody actually has control over this stuff, be it social media or actual media. Mark Zuckerberg (and his engineers) probably didn't actively influence the way Facebook bubbles formed as much as Fox News worked to actively represent things in a certain light, but that doesn't mean it's less dangerous. If anything, these amorphous and invisible forces have no one to answer to and thus no need for moral judgement.

Like I said, last night I considered quitting TV and social media altogether. Now that my nerves have calmed, I think trying to be more aware is probably a good first step. Going back to my conflict between being Asian and white: the infiltration of my mind by invisible social forces is jarring and unpleasant, and I'm not even that whitewashed. But on the flip side, the mixing of cultures - with white people, black and brown people, and other Asian people - is part of what makes immigrant life beautiful in its own way (so says Mei, anyway). For now, I will probably just try to find my roots in soup dumplings and less tanned skin - fuck, I think Asian people like fair skin because of Western media influences in Asia.

Alright, tin foil hat back on.

11/52: Cycles & Sublime

Mar 23, 2017 (For week of Mar 13)

Our hearts beat every second or so, again and again, but those microscopic loops are unraveled to become a line connecting the previous moment and the next, making a minute, an hour, a day. From the moment we regain consciousness, up to the moment we drift off to sleep - day after day - we slowly crawl forward from one time unit to the next. Our clocks are circular, perhaps only because it is more mechanically efficient to make them so, since for us, the passage of time through life itself is a journey with a well-defined direction - forward. If we ever come to know the time of our own demise, or that of our universe, we will start making clocks that count up, or down, towards it; I am sure of it.

The day starts and ends with sunrise and sunset, and in between them, a day becomes a dot in the line connecting months, years. Somehow, we've done a remarkable job in unfolding cyclic and non-linear processes into a line - such is the curse of time moving forward, or rather, of how we experience it. But when I stop to experience the day at these most crucial moments, moments separating day and night, one dot from the next, time itself seems to suddenly fill with material, the moment becomes palpable - sublime - out of thin air. In these brief seconds before the sun falls below the horizon, the sky explodes with what seems to me like the set of all possible colors, and a paradox. The paradox is that while the moment itself is so brief, when measured by the turning across equidistant ticks in our machines, it has never failed to feel like an eternity. I'm never sure of the theatrics that I'm watching (or participating in?), or what exactly I'm anticipating, but I'm anticipating something. Am I rooting for the immaculately shaped fireball to disappear, or to stay? When it does quietly sneak away, though, I feel an even stranger sense of satisfaction, as if I've experienced the most climactic finale of a heroic tale. The story ends, and starts again tomorrow.

I don't do this often, but being on the edge of the continent overlooking the Pacific is a privilege I have learned to not take for granted. But it was only very recently that I witnessed the opposite: the sun peeking through the emerald surfaces of the Caribbean Sea. Actually, it is the moment itself that gives the sea its crystalline turquoise. The upside-down birth of the sun itself gives birth to all that we see. More, in fact, as the golden threads weaving these clouds only reveal themselves for this tiny moment, and then they disappear.

And it was in that moment I realized that the story never ends: whenever and wherever there is a sunset, there is always a sunrise at that exact same instant for someone else on this Earth. There is no beginning nor end, yet it always tells a story. Or rather, we always tell a story about it.