Must Read

Beyond End-to-End: Unveiling the Quantum Threat to Encryption

If you’ve ever used WhatsApp or Instagram to communicate with friends and family, you may have noticed that your messages are “end-to-end encrypted”. At first glance, that sounds great. All your messages are safe and secure – or so you’d think.

However, not every encryption method is created equal, and with the rise of cyberattacks and increasingly sophisticated technology, especially in the quantum field, it pays to be careful when choosing which tools to use. To understand the scale of the issue, though, we first need to look at the mathematics that makes this risk possible in the first place.

Shor’s algorithm poses a major threat to the security of current industry-standard encryption methods like RSA and ECC, which rely on the difficulty of factoring large integers (RSA) or computing discrete logarithms (ECC). That difficulty only holds in the classical world of computing, where candidate solutions are effectively tried one after another until one works – an exponential-time search that makes breaking such encryption practically impossible. A quantum computer, by contrast, can place its qubits in a superposition of exponentially many states and use quantum interference to extract the answer in polynomial time. In simpler terms, many of the “asymmetric” encryption methods we rely on today are at risk.
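
To make this dependence on factoring concrete, here is a minimal Python sketch with deliberately tiny, insecure numbers (chosen purely for illustration; it needs Python 3.8+ for the modular inverse): anyone who can factor the public modulus can rebuild the private key, and efficient factoring is exactly what Shor's algorithm provides.

```python
# Toy RSA with tiny primes - purely illustrative; real keys use 2048+ bit moduli.
p, q = 61, 53                      # secret primes
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent requires the factors of n

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)

# An attacker who can factor n (the step Shor's algorithm makes fast)
# recovers the private key and, with it, the message:
def factor(m):
    for i in range(2, int(m ** 0.5) + 1):
        if m % i == 0:
            return i, m // i

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(ciphertext, d2, n) == message)   # True
```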

This creates a domino effect on symmetric encryption as well: most symmetric keys are exchanged between users through an asymmetric key-exchange step, and if that step is compromised by Shor’s algorithm, everything encrypted with the resulting key – including your texts and photos – could potentially be decrypted.
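
To sketch that domino effect, here is a toy Python example, with XOR standing in for a real symmetric cipher such as AES purely for illustration: once the asymmetric layer that wrapped the session key is broken, everything that key protected falls with it.

```python
import secrets

def toy_symmetric(data: bytes, key: bytes) -> bytes:
    """Stand-in for a real symmetric cipher: XOR with a repeating key (NOT secure)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

session_key = secrets.token_bytes(16)            # the symmetric key for a chat
ciphertext = toy_symmetric(b"my texts and photos", session_key)

# In a real protocol, session_key itself is exchanged under RSA/ECC.
# A quantum attacker who breaks that exchange learns session_key and can simply do:
print(toy_symmetric(ciphertext, session_key))    # b'my texts and photos'
```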

While this threat isn’t something ordinary individuals can mount today – quantum computers are costly, sophisticated pieces of technology – many governments and research groups are increasingly aware of their potential and have built machines of their own. There is a real risk that quantum capabilities could escalate cyberattacks and transform the digital landscape as we know it.

Moreover, some organisations and individuals are already adopting a strategy called “Harvest Now, Decrypt Later”: accumulating databases of encrypted information in the hope that it can one day be decrypted with sufficiently powerful quantum computers.

In response, many companies and researchers (including NIST) have taken measures to strengthen encryption and build quantum-safe algorithms into their communication protocols. One example is the open-source messaging platform Signal, which introduced the PQXDH protocol, designed to resist attacks based on current advances in quantum computing; Signal notes, however, that the protocol will need to be upgraded as future findings and vulnerabilities may require additional security adjustments. If you wish to, the whitepaper for the protocol can be accessed here.

Conclusion

Ultimately, these advances pose a monumental risk to information security. Although it’s easy to be pessimistic about them, I believe the response they have prompted is a step in the right direction towards safeguarding our digital security and communication. As individuals and organisations alike, we must take proactive measures:

  • Stay Informed: Keep abreast of developments in quantum computing and its implications for encryption. Awareness is key to making informed choices.
  • Quantum-Safe Encryption: Consider adopting encryption methods that are resilient to quantum attacks. New cryptographic standards, often referred to as Post-Quantum Cryptography (PQC), are being developed to address this specific concern.
  • Advancements in Technology: Support and invest in technologies that stay ahead of the curve (especially open-source projects), continually updating encryption methods to withstand emerging threats.

Sources

https://csrc.nist.gov/projects/post-quantum-cryptography/
https://statweb.stanford.edu/~cgates/PERSI/papers/MCMCRev.pdf
https://purl.utwente.nl/essays/77239/
https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/security/encryption/what-types-of-encryption-are-there/
https://signal.org/docs/specifications/pqxdh/

500Hz Monitors: Who’s Buying Them?

When it comes to monitor refresh rates, higher is undoubtedly better. Tempering this idea, however, is a trend of diminishing returns, where each additional increase in refresh rate becomes less impactful.

“Frametime” refers to the amount of time it takes for a screen to refresh its display once. There’s an inverse relationship between frametime and refresh rate: as the refresh rate increases, the frametime decreases. More precisely, we can model the frametime (in milliseconds) as f = 1000 / x, where x is the refresh rate (in hertz).
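
As a quick sanity check, here is a small Python sketch of that formula; the refresh rates listed are simply the common tiers discussed in this article.

```python
# frametime (ms) = 1000 / refresh rate (Hz)
rates = [60, 144, 240, 360, 500]                     # common monitor tiers
frametime = {hz: 1000 / hz for hz in rates}

for hz in rates:
    print(f"{hz:>3} Hz -> {frametime[hz]:5.2f} ms per frame")

# The saving shrinks at every step (diminishing returns):
print(f"60 -> 144 Hz saves {frametime[60] - frametime[144]:.2f} ms")    # ~9.7 ms
print(f"144 -> 240 Hz saves {frametime[144] - frametime[240]:.2f} ms")  # ~2.8 ms
print(f"360 -> 500 Hz saves {frametime[360] - frametime[500]:.2f} ms")  # ~0.8 ms
```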

Notice that while upgrading from 60Hz to 144Hz improves frametime by about 9.7ms, a further leap from 144Hz to 240Hz offers a smaller gain of only about 2.8ms. This is why the transition from 144Hz to 240Hz feels less noticeable than the jump from 60Hz to 144Hz. And as the refresh rate keeps climbing, the diminishing returns only become more pronounced; the difference consumers actually perceive shrinks with each further upgrade.

Standing at the highest end of today’s monitor market are 500Hz displays. So the question arises: with a benefit so seemingly insignificant, who exactly is the target audience for these monitors?

Well, while diminishing returns are unavoidable, the degree to which users notice the improvement can vary. Picture two very different activities: typing in a document and playing a first-person shooter. When a user is typing, only a small portion of the screen updates at a time. While playing a first-person shooter, on the other hand, nearly the entire screen is in constant motion and updates rapidly, creating far more disparity between consecutive frames. For this reason, users whose activities keep most of the screen changing rapidly benefit the most from a 500Hz monitor.

(Most frames do not significantly change) (Source: TrevorTrac)

(Most frames do significantly change) (Source: MP1st)

Additionally, fully utilizing a 500Hz monitor requires a video game to consistently run at or above 500 fps (frames per second), and running applications at such high framerates requires a higher-end system.

Furthermore, even the most powerful hardware can only push a handful of games above the 500 fps threshold. Most titles that consumers play are recent releases, designed by their developers to run at around 60 fps on upper-midrange consumer hardware. There are a few exceptions, however, that can reach 500 fps at reasonable settings: video games such as Valorant, Minecraft, and older first-person shooters are among the select few that can fully utilize a 500Hz monitor.

The improvement from 360Hz to 500Hz is not nearly as significant as past generational leaps. But there is still a niche userbase that tangibly benefits from 500Hz monitors: users whose games 1) frequently and significantly update most of the screen and 2) can consistently run at or above 500 fps on their system.

Synthetic Biology: A Brave New World of Cures and Cautions

As a recent and ever-changing branch of medicine and science, synthetic biology is paving the way for the future of medicine. Defined as a “research and engineering domain of biology where a human-designed genetic program is synthesized and transplanted into a relevant cell type from an extant organism” (A.M. Calladine, R. ter Meulen, 2013), synthetic biology offers possible solutions to some of society’s most pressing medical issues. Through DNA sequence modification and genome editing, scientists have been able to edit genetic material in living organisms with tools such as CRISPR (clustered regularly interspaced short palindromic repeats). This ability allows scientists to give organisms genetic capabilities that nature has not yet provided. CRISPR also allows for the creation of ‘living therapeutics’ and the introduction of engineered immune cells into the human body.

So, what does this all mean? Synthetically created genetic tools have already enabled breakthroughs in several areas of production, such as silkworms engineered to produce spider silk, as well as genetically engineered foods like cheese and plant-based meat, some of which are already commercially available. This gives society more sustainable ways of creating different materials, which may prove necessary as we continue to experience the environmental impacts of consumerism. Living therapeutics and engineered immune cells can help treat patients with various diseases, including multiple forms of cancer, giving them a better chance of recovery and survival. Synthetic biology also assisted in the rapid development and mass production of certain COVID-19 vaccines by enabling the synthesis of genetic material from the published SARS-CoV-2 genome sequence.

It’s clear that an abundance of benefits derives from the use of synthetic biology. However, as with most technological advancements, there is also a profusion of risks, many of them ethical and potentially very dangerous. According to researchers at the University of Oxford, synthetic biology, although promising, gives biologists a concerning way of ‘playing god.’ Misusing synthetic biology could destroy existing ecosystems and erode the crucial distinction between living organisms and machines. Losing that distinction could be catastrophic for how humans value different organisms, and it creates an ethical concern about prioritizing machines and technology over nature and living things. Synthetic biology also introduces the risk of synthesizing known human pathogens, such as influenza or smallpox, which could be released in far more dangerous forms than currently exist. Although some of these risks are unlikely, the damage they could inflict would be devastating.

When considering the sad reality of human greed, it is essential to question whether the findings of synthetic biology will continue to be used for good. In the wrong hands, the technology could wipe out multiple existing species, ultimately jeopardizing the balance of our ecosystems. Synthetic biology also poses a genuine risk of bioterrorism, as hazardous, genetically engineered organisms could be created and released maliciously. Control of this technology is concentrated in wealthier countries, creating inequality in access and usage. This gives certain countries, such as the U.S., an extensive scientific advantage over others, one that could be exploited at other nations’ expense.

What the future of synthetic biology holds remains to be seen, but it is imperative that both the benefits and the drawbacks are considered. Naturally, we hope synthetic biology continues to be used for the greater good of humankind, but that could change easily and swiftly. Since we are already in the midst of multiple ethical, moral, and environmental crises, it is necessary to be mindful of the information we consume and promote, particularly regarding the ongoing evolution of technology and science.


Preliminary 2023-2024 Winter Outlook

Although the start of winter is still far away, it is never too early to look at some indications of what the upcoming winter might bring. A notable development from the spring into the summer of 2023 was the formation of an El Niño pattern, a major change from the La Niña pattern that has dominated the globe for the past few years. An El Niño pattern features above-average sea surface temperature anomalies in the equatorial Eastern Pacific, around the Galápagos Islands, which in general leads to more thunderstorm activity and less nutrient-rich waters there. The warmer these waters are, the stronger the El Niño tends to be; currently, the El Niño is considered weak.
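
For readers wondering how labels like “weak”, “moderate”, and “strong” are typically drawn, here is a rough Python sketch based on commonly cited Oceanic Niño Index (ONI) thresholds; the cutoffs and the sample anomaly are illustrative, not an official NOAA product.

```python
def enso_category(oni: float) -> str:
    """Classify ENSO state from the Oceanic Nino Index (3-month mean SST
    anomaly, in deg C, over the Nino 3.4 region), using common thresholds."""
    if oni >= 2.0:
        return "very strong El Nino"
    if oni >= 1.5:
        return "strong El Nino"
    if oni >= 1.0:
        return "moderate El Nino"
    if oni >= 0.5:
        return "weak El Nino"
    if oni <= -0.5:
        return "La Nina"
    return "ENSO-neutral"

print(enso_category(0.8))   # illustrative summer-2023-like anomaly -> "weak El Nino"
```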

Image obtained from Tropical Tidbits

The general pattern for an El Niño features the Pacific jet cutting across the southern tier of the U.S., providing frequent storm activity and generally cooler conditions, while much of the northern tier experiences warmer-than-average temperatures due to the lack of frequent storms. The Ohio Valley also tends to be drier than normal, sitting away from the active jet stream track to the south.

Image obtained from PowderChasers

The El Niño is currently in a weak phase and is expected to intensify as winter approaches, with NOAA giving roughly a 90% probability of a moderate to strong El Niño forming by the winter of 2023-2024. Many forecast guidance models support this outlook, showing a continued increase in SST anomalies in the Eastern Pacific.

All images obtained from Columbia University

With a high chance of a moderate to strong El Niño developing by the winter of 2023-2024, it is important not only to examine long-range forecasts, such as those provided by the NMME, CanSIPS, and CFS, but also to look at historical climatology for weak and moderate-to-strong El Niño episodes. From December to March, weak El Niños historically tend to favor below-average temperatures for the eastern two-thirds of the U.S., while moderate to strong El Niños tend to favor below-average temperatures across the southern tier and above-average temperatures across the northern tier. For precipitation, weak El Niños tend to bring below-normal totals to the Southeast, Midwest, and Southern Plains, with above-average precipitation favored for the West Coast. Moderate to strong El Niños tend to favor above-average precipitation for the South, East Coast, and immediate West Coast, and below-average precipitation for the Ohio Valley and parts of the Northwest.

Weak El Niño (1981-2010):

Images obtained from weather.gov

Moderate to Strong El Niño:

Image obtained from weather.gov

Taking all of this together, Stemify has created a preliminary 2023-2024 outlook for both temperature and precipitation. Overall, below-normal temperatures are expected for the Southeast, Southern Plains, and parts of the lower Ohio Valley, while above-normal temperatures extend widely across the northern tier, particularly across the Upper Great Lakes, Northwest, upper New England, and Alaska. In terms of precipitation, above-normal precipitation is expected along the Gulf Coast and East Coast, with the bullseye over portions of north Florida and south Georgia. Below-normal precipitation is expected across the Northwest.

Images created by Navam Arora

Disclaimer: The Climate Prediction Center also produces similar seasonal outlooks, but this outlook is not based solely on the CPC’s; the CPC outlook can be found here. These outlooks are intended for general purposes only and depict trends over time, not day-by-day forecasts. Errors are possible due to the high degree of uncertainty in forecasting months in advance. Most data is obtained from Tropical Tidbits.

Who Would You Trust More: AI or Doctors?

For as long as the profession has existed, doctors have worked diligently to perfect their craft, diagnosing, treating, and curing their patients in what they judge to be the most effective way possible. Yet mistakes are frequently made: medical error has been estimated to be the third leading cause of death in the United States, with over 250,000 deaths occurring yearly. Despite the rigorous education doctors undergo before they are allowed to practice, they still make mistakes. It’s human nature to err, even in life-or-death scenarios. For most of history, this appeared to be simply a cost of keeping one of the world’s oldest, and most vital, professions running.

But what if the risk of human error could be eliminated by removing humans from the equation when it comes to delivering medical care? That would pivot the medical industry, and the person-to-person interaction we all know today, in a completely different direction. Some speculate that this is possible through the use of artificial intelligence (AI).

Artificial intelligence has made brief forays into the medical field before, but those efforts were shut down due to a variety of complications, whether availability, cost, unreliability, or a combination of these factors (among others). This was especially true of MYCIN, an expert system designed by Stanford University researchers to assist physicians in diagnosing and treating bacterial infections. Despite its superb accuracy, rivaling human experts on the matter, it was far too rigid and costly to maintain. Although not medically affiliated, Google’s image-recognition software offers another example of how unreliable AI can be: it classified, with near-total confidence, a slightly altered image of a cat as guacamole, a completely incorrect observation.

However, as modern technology rapidly advances, with special emphasis on machine learning (systems that improve their performance from data without explicit human programming), some believe that AI can now pick up the slack for physicians.

This claim isn’t entirely unsubstantiated: artificial intelligence can already assess whether infants have certain genetic conditions (of which there are thousands) from facial markers, something doctors struggle with due to the massive variety of such syndromes. MGene, an app that has AI examine a photo taken by its user, reports over a 90% success rate at detecting four serious, potentially life-threatening syndromes (Down, DiGeorge, Williams, and Noonan). AI even flagged the outbreak of COVID-19, caused by SARS-CoV-2, in Wuhan, China (the origin of the virus) a week before the World Health Organization (WHO) announced it as a new virus.

With every passing day, more of the necessary boxes are being checked, bringing the possibility of artificial intelligence becoming a dominant presence in the medical field one step closer to reality.

That isn’t to say there are no issues with bringing artificial intelligence into the medical industry: beyond the earlier problems of cost and unreliability, the fact that AI is ever-changing also opens the door to bias, ranging from socioeconomic status to race to gender and everything in between. In addition, the use of AI makes many people uncomfortable because it removes the familiar person-to-person interaction, another major issue that must be addressed for artificial intelligence to be implemented successfully in the healthcare sector.

Regardless of what side you are on, there is common ground: artificial intelligence will continue to grow more advanced. While it is uncertain whether the general public will want AI to replace doctors, serve as a back-end helper, or stay out of the office entirely, it is clear that artificial intelligence is a tool with both significant benefits and drawbacks. Whether it ends up being implemented is a question left to the future.

Quasars Show that Time was Slower in the Early Days of the Universe

Artist’s rendering of the accretion disc in ULAS J1120+0641, a very distant quasar powered by a supermassive black hole with a mass two billion times that of the Sun. Image: https://en.wikipedia.org/wiki/Quasar

(Astronomy) A team of astronomers led by Geraint Lewis, an astrophysics professor at the University of Sydney’s School of Physics, has recently shown that time in the early days of the universe, when it was roughly a billion years old, appears to run significantly slower than time does today. By studying quasars, the incredibly active cores of galaxies powered by supermassive black holes, the team was able to measure how much faster the present universe ticks compared with the distant past. The finding also buttresses Albert Einstein’s general theory of relativity, which predicts that, in an expanding universe, the distant past should appear to us to run in slow motion.

Five times slower. That’s how slowly Professor Lewis’ team found time to appear to run in the universe’s early days. To quote Professor Lewis: “If you were there, in this infant universe, one second would seem like one second — but from our position, more than 12 billion years into the future, that early time appears to drag.”
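
For context, that factor follows from the standard cosmological time-dilation relation: a clock at redshift z appears to us to run (1 + z) times slower. A minimal Python sketch (the redshift here is chosen only to reproduce the factor of five, not quoted from the study):

```python
def apparent_slowdown(z: float) -> float:
    """In an expanding universe, processes at redshift z appear stretched
    in time by a factor of (1 + z) from our vantage point."""
    return 1.0 + z

# A quasar at redshift z = 4 (illustrative value) appears to run five times slower:
print(apparent_slowdown(4.0))   # 5.0
```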

This discovery will have a major impact on other astronomers. Understanding the passage of time at the beginning of the universe can help them tackle not only the ultimate fate of the universe, but also questions such as “How was the universe formed?” and “Are there other universes besides ours?”

A Solution to the Ills of Chemotherapy?

600,000 deaths. That’s roughly how many casualties a foe we can’t so much as see with the naked eye was estimated to cause in the United States in 2021: cancer. It is the dreaded illness that, since the foundation of modern medicine, humanity has seemed unable to tackle and extinguish permanently. Despite the advancement of technology (particularly in the medical sector), it seems we are still a long way off from adequately dealing with it on a global scale.

That isn’t to say there are no ways to fight this disease. Chemotherapy, for instance, is one such treatment. It decimates cancerous cells, but at a massive cost to the body, because it also kills the healthy cells humans need in the process. As a result, patients become immunocompromised, which not only increases their risk of contracting infections but also raises the chance that common ailments (such as the common cold or the flu) quickly escalate into life-threatening hospital visits.

Those who administer chemotherapy describe it as a double-edged sword, and for a long time it appeared doubtful that its negative effects could ever be reduced. After all, it took modern medicine so long to even discover this treatment, reinforcing the notion that humanity’s war against cancer had arrived at a stalemate.

Then came a new approach: stem cell transplants. This method aims to solve the problems that chemotherapy creates by infusing stem cells into a vein. The cells travel to the bone marrow and mature into the new cells the body needs: platelets (which help blood clot), white blood cells (which support the immune system and fight infection), and red blood cells (which carry oxygen throughout the body).

Proponents claim this is an instrumental tool in humanity’s battle against cancer because of its ability to help patients recover after chemotherapy, which is widely considered the most prevalent form of cancer treatment. Although it may not be the final answer, it certainly raises questions that may pave the way toward further technological advancements in this war.

That’s not to say there aren’t those who oppose this method, however. Some argue that the treatment excludes the common patient: stem cell transplants are incredibly expensive due to their highly advanced technological nature, and this high price tag puts the potentially life-saving treatment out of reach for the vast majority of cancer patients, raising an ethical dilemma about wealth and the ability to save a life (if not many). Others point out that, much like chemotherapy, it comes with drawbacks of its own in the form of side effects. From bleeding to an increased risk of infection (the very thing it is partly designed to combat), it too poses a set of risks that, in the eyes of some, cannot be ignored.

Image credit: bioinformant.com, depiction of stem cells.

Regardless of your stance on this matter, there is a middle ground: this innovation, despite its shortcomings, has advanced the battle against cancer in more ways than one. Beyond helping people regain some sense of normalcy by alleviating the impacts of chemotherapy, it also grants hope to those who have (or can obtain) access to the treatment. Modern medicine, just as it has beaten back measles, rubella, and countless other diseases, will hopefully beat this one too.

Sources

https://www.cancer.gov/about-cancer/treatment/types/stem-cell-transplant
https://www.cancer.org/research/cancer-facts-statistics/all-cancer-facts-figures/cancer-facts-figures-2021.html