As alarming as it sounds, even a lick of peanut butter can be life-threatening. Allergies. What are they? Let’s see. Had the peanut in peanut butter been harmful to everyone, it wouldn’t be called an allergy. Only when a substance provokes an unusual reaction in a select few is it called an allergy.
So the question arises: how do I know if I’m allergic, and what am I allergic to?
Allergies come in many forms, with triggers ranging from water to the nickel in coins. You can’t possibly predict which substances will react badly with your body without ever being exposed to them. This is why allergy tests are done.
Well, only a medical professional can tell you what you’re allergic to, unless something you had eaten or been exposed to previously didn’t sit right with you. Symptoms of an allergy range from a runny nose to breathlessness and, of course, the scary, itchy hives.
Let’s take a look at what the doctor is doing behind the scenes, shall we?
An immunologist or allergist usually performs the test, which involves a skin prick or a patch test. The image above, from Westhillsaaa, shows a medical professional checking for unusual reactions in a patient’s skin using various triggers.
The tests range from injecting allergens into your skin to drawing a blood sample. The choice of test varies according to the patient’s data, including their medical history, condition, and suspected triggers.
Something to note about allergies is that a person can outgrow them with time. This is commonly seen in children outgrowing food allergies, but some allergies, such as those to pollen and medications, persist for a long time or even for life.
Although you can’t simply get rid of an allergy that persists into adulthood, you can take certain medications and treatments, prescribed accordingly, to reduce complications.
A common treatment is desensitization, which is basically building tolerance to your allergen by exposing your body to it periodically in small concentrations.
A personal suggestion: have an emergency action plan, including an EpiPen, ready just in case things go south after eating or reacting to something new.
And in the near future, who’s to say that, at the rate medical technology is growing, we couldn’t have a permanent remedy for allergies? That’s a topic up for discussion.
If you’re reading this right now, you have a personal stake in answering one question: how did life come into existence? From the concepts of evolution (the progression of life) to the origin of innovations that improved the quality of living a thousandfold, this question has been debated by scientific communities worldwide for centuries.
Before we delve into this topic, we need to address what exactly life is in order to understand how it can be identified. Now, this may seem obvious, as you can just point to yourself and shout “life!” with a fervor akin to a eureka moment, but bear with me. There are thousands of definitions covering this topic, so we’ll use the black-and-white one necessary to understand this article: “Life is a quality that distinguishes matter that has biological processes, such as signaling and self-sustaining processes, from matter that does not. […]” (Wikipedia, para. 1).
Now that we’ve addressed what we’ll be talking about, let’s touch upon the ideas that have been supported by members of scientific communities. There have been a slew of theories, each as different as the next. Some have ignored evolutionary standpoints in favor of biblical accounts in which God created Adam and Eve and commanded them to procreate and foster future generations. Others have speculated about mind-altering possibilities, like life forming out of stardust. There are even people who give in to the idea that one day life just happened to occur, defeating the odds in a 1 in 4^300 fashion (in short, a near-zero possibility).
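To get a feel for just how near-zero that 1 in 4^300 figure is, here is a quick back-of-the-envelope calculation (a minimal Python sketch; reading 4^300 as the odds of randomly assembling one specific 300-base sequence from the four RNA bases is our assumption, since the article doesn’t derive the number):

```python
import math

# Illustrative only: 4^300 read as the odds of hitting one specific
# 300-unit sequence drawn from an alphabet of 4 (e.g., the RNA bases).
odds = 4 ** 300

# Express the magnitude as a power of ten for readability.
exponent = math.log10(odds)  # 300 * log10(4) is about 180.6
print(f"4^300 is about 10^{exponent:.1f}")  # roughly 1 chance in 10^181
```

For comparison, the observable universe is estimated to contain only about 10^80 atoms.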
But for all the struggles of science, one theory has gained consistent traction, one that ties together several already-acknowledged realities: the assembly of cells to function like a team, the concepts of natural selection and evolution, and a really, really, really small chance.
First and foremost on this journey are the building blocks. Whether it’s a massive skyscraper or a human being, everything is assembled from smaller units, and life itself is no exception. In this case, its building blocks are cells, or “[…] the smallest, basic unit of life that is responsible for all of life’s processes […]” (BYJUS, para. 1).
Now, this is where the concept of a really small chance first comes in. While the possibility of a human just appearing fully formed is practically zero, the chance of an extremely improbable chemical reaction occurring is considerably higher. After all, the universe’s formation and all it entails follows a similar line of reasoning, but that’s a digression.
Then comes the theory of evolution. Just as humans and chimpanzees descended from a common ancestor (displayed by likeness in genes and similarities in physical features), cells underwent the same kind of transformation. With the help of natural selection, that is, ‘survival of the fittest’ in short, these cells were able to continuously evolve and progress.
Although these early cells didn’t have as much to differentiate themselves as the fully functioning species we see today, they were still able to adapt by replacing the parts they used to function with more efficient alternatives. For instance, cells swapped out their original genetic material, RNA, for deoxyribonucleic acid (DNA), given its improved stability. On top of that, cells incorporated other molecules, like proteins, as catalysts to speed up chemical reactions, acting as the intermediary between a cell and its designated function. Their capacity to evolve and adapt shows that they were the original pros at becoming the lean, mean, efficient machines found in every living thing.
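To make ‘survival of the fittest’ concrete, here is a deliberately tiny toy simulation in Python (an illustration only, not a biological model; the variant names and the 10% fitness edge are invented for the sketch). A rare variant that copies itself slightly more efficiently will, generation after generation, typically come to dominate a fixed-size population:

```python
import random

# Toy natural-selection sketch: two replicator variants compete.
# "slow" reproduces with relative fitness 1.0; "fast" with 1.1,
# standing in for a cell whose chemistry runs slightly better.
FITNESS = {"slow": 1.0, "fast": 1.1}
POP_SIZE = 1000

# Start with the fitter variant as a rare mutant (1% of the population).
population = ["slow"] * 990 + ["fast"] * 10

for generation in range(100):
    # Each individual's chance of leaving offspring is proportional to
    # its fitness; total population size stays fixed at POP_SIZE.
    weights = [FITNESS[v] for v in population]
    population = random.choices(population, weights=weights, k=POP_SIZE)

share = population.count("fast") / POP_SIZE
print(f"'fast' share after 100 generations: {share:.0%}")  # usually ~99%
```

A mere 10% advantage, compounded over a hundred generations, is enough for the rare variant to take over almost completely.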
Finally, we arrive at how we went from microscopic cells to actual, fairly visible matter. Although individual cells reach a limit on their own, that limitation is overcome through multicellularity (the combination of cells to perform a function together). When this happens, cells stop going their separate ways and instead band together, enabling larger formations (such as complete organs) to come into existence.
Ironically, the formation of something as volatile and unpredictable as life came through a series of just as unlikely events, from extremely improbable reactions to survival-of-the-fittest dynamics among cells that can’t even speak to one another (or maybe they can and we just don’t know). But it does make one wonder: what else has yet to be discovered? What else has yet to be formed from a series of unpredictable and yet possibly fated events?
Something you can’t see or hear until years go by. Something you recognize as simple and yet impossible to avoid. Something that is known as both the cruelest and most beautiful law in all of nature. Something that neither the richest nor poorest person can escape from. That something is time.
Throughout history, humans have managed to conquer just about everything, from minuscule problems to global affairs. Yet with all of our minds combined, we have still failed to defeat the toughest opponent of all: time. For what seems like forever, it has stood as the one unstoppable force that nobody could fight.
That is, until 2022. While this year beckoned the end of the COVID-19 pandemic, it also brought news of a study by David Sinclair, a molecular biologist who has spent the vast majority of his career (twenty years) searching for ways to reverse aging, and in doing so, undo time. While the beginning of his journey was unsuccessful, he didn’t give up.
The study took two mice (siblings born from the same litter) and genetically altered one of them to make it considerably older, with marked success. While this alone is not indicative of a reversal in aging, it does raise an important question: if aging can be sped up, can it also be slowed down, or even undone altogether? Before we get to that, though, we need to understand just how the mice were genetically altered and why.
Image credit: https://www.cnn.com. Two mice from the same litter, drastically different in apparent age.
Many believe that aging is caused by cell damage, but that’s not exactly accurate. Damage is one contributor, yes, but it is not the main cause. Instead, we should look at the heart of the matter: the epigenome, which determines what each cell becomes and how it works, an instruction manual of sorts for each cell. When the epigenome malfunctions, the cells’ “instructions” are lost, and the cells fail to keep functioning.
So Sinclair used gene therapy to restore the cells’ instructions, and the results were shocking. He was able to demonstrate success not only in accelerating aging but also in reversing it, by nearly 60%. What’s more, the approach appears remarkably general, with Sinclair noting that “[he’s] been really surprised by how universally it works. [Him and his team] haven’t found a cell type yet that [they] can’t age forward and backward.”
This extends beyond mice: the technique has already been used to reverse aging in non-human primates, with the gene reprogramming switched on using doxycycline, an antibiotic, and it saw rapid success. There has even been some human experimentation, with gene therapy performed on human tissues in lab settings.
The ability to reverse aging across the board raises more than the prospect of stopping time; it also opens the possibility of halting the sicknesses that come with aging. After all, these illnesses (like dementia and Alzheimer’s, among others) are caused by cell malfunction. If the reversal of aging is potent enough, it has the potential to undo these illnesses as well.
With the potential to halt aging and let people live into their hundreds without fear of age-related illness, the possibilities are countless. If we can already undo aging on a small scale, imagine what the world ten, fifty, or even a hundred years from now might hold.
Many of the most infectious and deadly viruses owe their danger to their RNA code. Researchers from established research universities, such as NYU and Columbia, alongside the New York Genome Center, have developed a new type of CRISPR technology that targets this RNA and might just prevent the spread of deadly diseases and infections.
A new study in Nature Biotechnology suggests that major gene-editing tools like CRISPR will be beneficial at an even larger scale. CRISPR, in a nutshell, is a gene-editing technology that can be used to switch gene expression on and off. Until now, CRISPR was known to edit only DNA, with the help of the enzyme Cas9. With recent work on the enzyme Cas13, RNA editing might just become possible as well.
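As a rough picture of what editing DNA with the help of Cas9 involves at the sequence level, here is a simplified sketch (our illustration, with made-up sequences): Cas9 is directed by a 20-nucleotide guide and cuts only where the guide’s match sits next to an “NGG” motif called the PAM. Real targeting involves far more biology than string matching.

```python
# Simplified Cas9-style target search: find positions where a 20-nt
# protospacer matching the guide is immediately followed by an NGG PAM.
# The sequences below are invented for illustration.

def find_cas9_sites(genome: str, guide: str) -> list[int]:
    """Return start positions where `guide` is followed by an NGG PAM."""
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        protospacer = genome[i : i + len(guide)]
        pam = genome[i + len(guide) : i + len(guide) + 3]
        if protospacer == guide and pam[1:] == "GG":  # "N" = any base
            sites.append(i)
    return sites

genome = "TTACGGATCAGTACGGATTCAGG" + "ATGCTAGCTAGGCTAACGTT" + "TGGAA"
guide = "ATGCTAGCTAGGCTAACGTT"  # 20-nt guide matching the middle block
print(find_cas9_sites(genome, guide))  # -> [23]
```

Cas13 works analogously on RNA: the guide directs it to a matching transcript sequence rather than a DNA site.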
RNA is a second type of genetic material present within our cells, and it plays an essential part in various biological functions, such as regulating, expressing, coding, and even decoding genes. It is central to processes such as protein synthesis, and the resulting proteins are necessary to carry out many of the body’s functions.
RNA viruses
RNA viruses usually come in two types: single-stranded RNA (ssRNA) and double-stranded RNA (dsRNA). They are notorious for causing some of the most common and well-known infections, examples being the common cold, influenza, dengue, hepatitis, Ebola, and even COVID-19. These dangerous and possibly life-threatening viruses carry only RNA as their genetic material. So, how might AI and CRISPR technology, using the enzyme Cas13, help fight these nuisances?
Role of CRISPR-Cas13
RNA-targeting CRISPRs have various applications, from editing and blocking genes to identifying possible drugs for a given pathogenic disease or infection. As a report from NYU states, “Researchers at NYU and the New York Genome Center created a platform for RNA-targeting CRISPR screens using Cas13 to better understand RNA regulation and to identify the function of non-coding RNAs. Because RNA is the main genetic material in viruses including SARS-CoV-2 and flu,” the applications of CRISPR-Cas13 could promise cures and newer ways to treat severe viral infections.
“Similar to DNA-targeting CRISPRs such as Cas9, we anticipate that RNA-targeting CRISPRs such as Cas13 will have an outsized impact in molecular biology and biomedical applications in the coming years,” said Neville Sanjana, associate professor of biology at NYU and associate professor of neuroscience and physiology at NYU Grossman School of Medicine.
Role of AI
Artificial intelligence is becoming more and more relied upon as the days pass by. So much so that it can be used to precisely target RNA coding, as in this case. TIGER (Targeted Inhibition of Gene Expression via guide RNA design) was trained on data from the CRISPR screens. Comparing the model’s predictions against laboratory tests in human cells, TIGER was able to predict both on-target and off-target activity, outperforming previous models developed for Cas13.
With AI assisting an RNA-targeting CRISPR screen, TIGER’s predictions might just initiate newer, more developed methods of RNA-targeting therapy. In a nutshell, AI can “sieve” out undesired off-target CRISPR activity, making the method more precise and reliable.
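TIGER itself is a trained deep-learning model, but the “sieving” idea can be sketched with a far simpler stand-in (a hypothetical mismatch-count heuristic of our own, not TIGER’s method): score each candidate guide against unintended transcript regions and reject guides that match them too closely.

```python
# Toy off-target "sieve": keep only guides that differ enough from
# every window of an unintended transcript. This mismatch-count rule
# is an illustrative stand-in; TIGER uses a deep-learning model.

def mismatches(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

def min_offtarget_mismatches(guide: str, transcript: str) -> int:
    """Lowest mismatch count of `guide` against any transcript window."""
    k = len(guide)
    return min(
        mismatches(guide, transcript[i : i + k])
        for i in range(len(transcript) - k + 1)
    )

# Hypothetical 10-nt guide candidates and an unintended transcript.
off_target = "AUGGCUAACGGAUUCCGAUU"
for guide in ["GCUAACGGAU", "CCCGGGAAAU"]:
    mm = min_offtarget_mismatches(guide, off_target)
    verdict = "reject" if mm < 3 else "keep"
    print(guide, mm, verdict)  # first guide is a perfect off-target hit
```

A learned model like TIGER replaces the crude mismatch threshold with a prediction of actual Cas13 activity, which is what makes the sieve precise.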
Wessels, HH., Stirn, A., Méndez-Mancilla, A. et al. Prediction of on-target and off-target activity of CRISPR–Cas13d guide RNAs using deep learning. Nat Biotechnol (2023). https://doi.org/10.1038/s41587-023-01830-8
600,000 deaths. That’s the estimated toll in 2021 from a foe we can’t so much as see with the naked eye: cancer. The dreaded illness that, since the foundation of modern medicine, humanity has seemed unable to tackle and extinguish permanently. Despite the advancement of technology (specifically in the medical sector), we still appear to be a long way from adequately dealing with it on a global scale.
That isn’t to say there are no methods to deal with this disease. Chemotherapy, for instance, is one such remedy. It decimates cancerous cells, but at massive risk to the body, because it also kills the necessary (good) cells humans need in the process. This treatment leaves patients immunocompromised, which not only increases the risk of contracting diseases but also raises the potential for common ailments (such as the common cold or the flu) to quickly turn into a life-threatening hospital visit.
Described by those who administer it as a double-edged sword, chemotherapy seemed unlikely to ever have its negative effects reduced. After all, it took modern medicine a long time just to discover this treatment, reinforcing the notion that humanity’s war against cancer had arrived at a stalemate.
Then came a new development: stem cell transplants. This method addresses the problems chemotherapy creates by administering stem cells through a vein. The cells travel to the bone marrow and become the new cells necessary for human health: platelets (which help with blood clotting), white blood cells (which assist the immune system and help the body fight infection), and red blood cells (which carry oxygen throughout the body).
Proponents claim this is an instrumental tool in humanity’s battle against cancer because of its ability to assist patients after chemotherapy, which is widely considered the most prevalent form of cancer treatment. Although it may not be the final answer, it certainly poses questions that may pave the way toward even more technological advancements in this war.
That’s not to say there aren’t those who oppose this method, however. Some argue that the treatment excludes the common man: stem cell transplants are incredibly expensive due to their highly advanced technological nature, and this high price tag keeps the vast majority of cancer patients from accessing a potentially life-saving treatment, raising an ethical dilemma about wealth and the ability to save a life (if not many). Others point out that, much like chemotherapy, it comes with drawbacks in the form of side effects. From bleeding to an increased risk of infection (the very thing it is partially designed to combat), it poses a set of risks that, in the eyes of some, cannot be ignored.
Regardless of your stance on the matter, there is a middle ground: this innovation, despite its shortcomings, has advanced the battle against cancer in more ways than one. Beyond helping people regain some sense of normalcy by alleviating the impacts of chemotherapy, it also grants hope to those who have (or can obtain) access to the treatment. Modern medicine, just as it conquered measles, rubella, and countless other diseases, will hopefully beat this one too.
Scientists are currently cultivating proteins from the stem cells of livestock and poultry in labs in a bid to create more sustainable meat, but will anyone want to eat it?
Lab-grown meat, although a promising concept, has been slow to hit the mainstream. The idea is to grow meat under laboratory conditions by extracting stem cells from live animals and placing them in a bioreactor (a vessel-like device), where salts, vitamins, sugars, and proteins are added. The oxygen-rich, temperature-controlled environment allows the stem cells to multiply dramatically, eventually differentiating into muscle fibres that cluster together, aided by scaffolding material.
Numerous start-ups and companies have invested millions in this technology. Eat Just, valued at $1.2 billion, was founded by Josh Tetrick in 2011, and the company began developing lab-grown chicken in 2016. “With the aid of a 1,200-liter bioreactor, the cells can develop into meat at a rapid rate with the whole process taking around 14 days. For comparison, the production of farm-based chicken is a 45-day process,” states the CEO of Eat Just. Evidently, lab-grown meat rivals its farm-based alternatives by offering a far more efficient production process.
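To see why two weeks of growth can be meaningful, consider simple exponential doubling (a back-of-the-envelope Python sketch; the seed count and the 24-hour doubling time are our assumptions, not Eat Just’s figures):

```python
# Back-of-the-envelope cell-growth sketch: repeated doubling.
# The seed count and doubling time are illustrative assumptions only.
seed_cells = 1_000_000          # hypothetical starting stem-cell count
doubling_time_hours = 24        # assumed doubling time
days = 14                       # the quoted production window

doublings = days * 24 / doubling_time_hours   # -> 14 doublings
final_cells = seed_cells * 2 ** doublings
print(f"{doublings:.0f} doublings -> {final_cells:.2e} cells")
# 14 doublings multiply the culture roughly 16,000-fold in two weeks.
```

Under these assumptions, the biomass compounds exponentially, which is what lets a bioreactor outpace the 45-day farm cycle.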
Currently, the meat industry slaughters tens of billions of animals every year, and meat consumption is expected to increase by more than 70% by 2050, according to the Food and Agriculture Organisation of the United Nations. In its current state, lab-grown meat will struggle to satisfy these demands. To put this into perspective, to produce enough meat to feed everyone in Singapore, Eat Just would need to use 10,000-litre bioreactors; moreover, the process is currently more expensive than traditional farming methods. However, with increased funding, it might soon become a reality.
Despite these challenges, the development of lab-grown meat products will continue, promising a wealth of benefits. Lab-grown meat is drug-free, cruelty-free, more environmentally friendly, and sustainable. One report estimates that lab-produced meats could lower greenhouse gas emissions by 78–96% while using 99% less land and 82–96% less water. It is, without a doubt, more sustainable than traditional meat farming.
In spite of these adversities, at the end of last year, restaurant 1880 in Singapore became the first in the world to serve lab-grown meat, after the country’s food agency approved the sale of cultured meat. This is a huge stepping stone for the future of lab-grown meat. One estimate by US consultancy firm Kearney suggests that 35 per cent of all meat consumed globally will be cell-based by 2040.
In an earlier interview, Josh Tetrick (founder of Eat Just) said, “Working in partnership with the broader agriculture sector and forward-thinking policymakers, companies like ours can help meet the increased demand for animal protein as our population climbs to 9.7 billion by 2050.”
It is beyond dispute that the status quo is not sustainable. So, do we have the appetite for change?