At least most other diseases have a clearly identifiable cause, however. Even if knowing that hasn’t helped where finding a cure is concerned, at least it provides something of a starting point.
Depression, though, is amorphous. Science has not been able to point to any single causative agent, and for that reason, treatments vary wildly in their effectiveness. Sometimes they work; sometimes, not so much. In fact, recent studies have shown that upwards of 30% of people who suffer from depression find no relief at all from any of the existing treatments, which most commonly take the form of Selective Serotonin Reuptake Inhibitors (SSRIs).
The question is, why? Why do these treatments work most of the time, but inexplicably fail to produce results for everyone? A recently completed study may hold the answer. A team led by Professor Kenji Doya, working in the Neural Computation Unit at the Okinawa Institute of Science and Technology Graduate University, has identified three distinct subtypes of depression.
Prior to this research, scientists had long suspected the presence of various forms of depression, but there was no road map to a clear consensus on the matter.
Professor Doya and his team pored over every scrap of clinical data from their 134 study participants, half of whom had recently been diagnosed with depression. They relied on blood tests and questionnaires to collect more detailed information about sleep history, mental health, lifestyle choices and life stressors. These were augmented by periodic MRI scans used to study participants’ brain activity, mapping a total of seventy-eight different regions of each participant’s brain to examine possible connections between those areas.
Overall, the team collected more than three thousand measurable features they could use to define and classify depression, which they subsequently grouped into five data clusters. Of those five, three corresponded with distinct subtypes of depression.
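The paper’s actual statistical machinery isn’t described here, but the general idea of grouping participants’ feature vectors into a handful of data clusters can be sketched with a toy k-means routine. Everything below (the data points, the cluster count, the algorithm choice) is invented for illustration, not drawn from the study:

```python
def kmeans(points, k, iters=20):
    """Toy k-means: repeatedly assign points to the nearest center,
    then move each center to the mean of its assigned points."""
    centers = list(points[:k])  # deterministic start: first k points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:  # recompute the center as the member mean
                centers[i] = tuple(sum(vals) / len(members)
                                   for vals in zip(*members))
    return clusters

# Two clearly separated groups of 2-D "feature vectors"
data = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
clusters = kmeans(data, k=2)
```

In a real study the vectors would have thousands of dimensions (one per measured feature) and the method would be considerably more sophisticated, but the principle of letting the data sort itself into groups is the same.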
Using the collected MRIs, the team was able to predict when prescribing SSRIs would effectively treat depression. Specifically, the answer lay in the angular gyrus, a region of the brain involved with attention span, spatial cognition, and language processing. When depression impacted this region, SSRIs proved to be an effective treatment.
The team also discovered that SSRIs were least effective against a subtype of depression triggered by childhood trauma.
The team was quick to note that they’re still in the earliest stages of their research and that much more work is needed, but we are now beginning to understand exactly how, why and, most importantly, when SSRIs are effective treatments for depression. The next step, of course, will be to discover new and more effective treatments for those cases where SSRIs fail. It’s an exciting branch of study that promises to greatly expand our understanding of a condition that is still, in many ways, a deep mystery.
Coffee appears to be a wonder drug. In recent months, a broad range of papers has been published on research hinting that coffee may help fight a variety of diseases, which is great news, considering that every day, the world consumes more than five billion cups of the stuff. Now, a new research paper joins the growing constellation of coffee research, and once again, the findings point to coffee being something of a miracle drink.
Earlier research into the matter indicated that coffee may offer at least some protection against both Alzheimer’s and Parkinson’s disease, but although the correlation was noted, a causal link was not established by the earlier work.
Dr. Donald Weaver called on biologist Yanfei Wang and Dr. Ross Mancini, a research fellow in medicinal chemistry, to take a closer look at which compounds in coffee might be responsible for the decreased risk. The team investigated three different types of coffee: light roast, dark roast, and decaffeinated dark roast.
One of the earliest things they noted was that the protective effects of coffee were identical in the caffeinated and decaffeinated varieties, which allowed them to rule out caffeine as the causative agent.
As the research continued, the team identified a group of compounds called “phenylindanes,” which aren’t present in coffee beans in nature, but only emerge during the roasting process. It turns out that phenylindanes have a unique property: They inhibit two protein fragments (tau and beta amyloid), keeping them from clumping. These two protein fragments are common in both Alzheimer’s and Parkinson’s.
Given that phenylindanes only appear during the roasting process, the team was not surprised to discover that the longer coffee beans are roasted, the more phenylindanes they contain, which means the preventive effect is stronger in dark roasts than in light or medium roasts.
Dr. Mancini had this to say about the team’s efforts:
“This is the first time anybody’s investigated how phenylindanes interact with the proteins that are responsible for Alzheimer’s and Parkinson’s. The next step would be to investigate how beneficial these compounds are, and whether they have the ability to enter the bloodstream, or cross the blood-brain barrier.
Mother Nature is a much better chemist than we are, and Mother Nature is able to make these compounds. If you have a complicated compound, it’s nicer to grow it in a crop, harvest the crop, grind the crop and extract it than try to make it.”
All true, and the team is already preparing for the next phase of the research where they’ll attempt to quantify the impacts of phenylindanes on the two proteins and begin to experiment on ways of creating an extract with a higher concentration.
Dr. Weaver notes that although the early research is both promising and encouraging, the team is still years away from being able to turn their discoveries to date into anything that even remotely resembles a treatment or a cure. Even so, it’s good news, and yet another benefit of drinking that all-important cup of coffee in the morning.
Sitting is the new smoking! You’ve probably read that phrase in dozens of papers, reports and articles over the last few years. In fact, it gets repeated with such frequency that it has almost become an article of faith in the medical community, but is it true?
Not according to an international research team working out of the University of South Australia. The team’s findings were recently published in the American Journal of Public Health, and should put the matter to rest once and for all.
While their findings show that excessive sitting, defined as sitting more than eight hours a day, does increase the risk of some chronic diseases and conditions by as much as 10 to 20 percent, and can even lead to an increased risk of premature death, the issue is one of scale.
Dr. Terry Boyle, one of the nine researchers on the team, had this to say about their research:
“The simple fact is, smoking is one of the greatest public health disasters of the past century. Sitting is not, and you can’t really compare the two.
First, the risks of chronic disease and premature death associated with smoking are substantially higher than for sitting. While people who sit a lot have around a 10-20 per cent increased risk of some cancers and cardiovascular disease, smokers have more than double the risk of dying from cancer and cardiovascular disease and a more than 1000 per cent increased risk of lung cancer.
Second, the economic impact and number of deaths caused by smoking-attributable diseases far outweighs those of sitting. For example, the annual global cost of smoking-attributable diseases was estimated at US$ 467 billion in 2012, and smoking is expected to cause at least one billion deaths in the 21st century.
Finally, unlike smoking, sitting is neither an addiction nor a danger to others. Equating the risk of sitting with smoking is clearly unwarranted and misleading, and only serves to trivialize the risks associated with smoking.”
While ‘Sitting is the New Smoking’ is certainly an eye-catching headline designed to get clicks and media attention, the reality is far different, and anyone who repeats the meme is only making the job of medical professionals around the world more difficult.
It’s certainly true that sitting for extended periods is problematic, and we should do everything we can to make people aware of the health risks associated with a sedentary lifestyle. That said, fearmongering is almost certainly the wrong approach to take here.
The next time you hear someone repeating the meme, take a moment to point out the harsh realities. In isolation, it won’t do anything to counter the prevailing narrative, but as more and more people begin to set the record straight, the hope is that we can collectively bring a measure of realism to the discussion.
No matter how you slice it, sitting isn’t the new smoking. The two aren’t even in the same league.
To be sure, the advances in biotech that we’ve seen so far have been both helpful and exciting, but they’ve barely scratched the surface of what’s ultimately possible. In large part, that’s because it’s no simple task to meld biological and technological materials, but as the science marches forward, that’s beginning to change.
New findings presented at Neuroscience 2018 point to further advances in the field that promise to usher in the next generation of biotech devices. The presentation focused specifically on recent advances in connecting physical devices to neural stimulation maps, which are poised to completely transform therapy and prosthetics for people with severe disabilities.
Here are some of the highlights, to give you a taste of what’s coming:
1) Researchers have developed a working prototype of a device that combines sound cues with computer “vision” that can help the blind perform routine tasks, including locating people and specific objects in their immediate vicinity.
2) A new brain stimulation technique dubbed “DCS” (Dynamic Current Steering) has shown promising results in terms of restoring limited vision to blind people.
3) A pilot program has been launched that uses avatars in a digital environment that, combined with real-time electronic feedback, has been used to improve the motor function of stroke victims, in some cases even years after the patient suffered the stroke.
4) A team of researchers has successfully created a prototype of a prosthetic hand that can provide feedback to its wearer in the form of task-related sensations.
5) Finally, biotech companies are generally getting better at processing brain signals and translating them into computer-guided hand movements, an advance that will increasingly afford people suffering from paralysis and people with quadriplegia a much higher degree of independence as they begin to integrate electrical-stimulation-based prosthetics into their daily lives.
This last item is particularly noteworthy. If you follow the biotech space at all, you may be aware of the “Emotiv” headset, which translates brain signals into control of various devices, allowing you, for example, to steer your wheelchair with the power of your mind.
As impressive as the tech was when it was first released, it had some serious limitations. Over time, though, researchers have been improving the mind/machine interface, and the results are not only getting better but expanding into other areas, most notably prosthetics.
In this regard, it’s not unlike speech-to-text software. When first introduced, it was good but fell far short of greatness, being about 65% accurate. The developers stuck with it, however, and today’s speech-to-text programs are better than 99% accurate. We’re seeing the very same arc of development here.
All that to say that biotech is poised for another round of game changing innovations that will bring us closer to the realization of the promise many of us saw in the industry when it first burst onto the scene, and that’s a very good thing indeed.
In January of this year, the Red Cross faced a crisis that has become all too familiar. With its resources stretched thin by storms battering the eastern seaboard, the organization sent out an urgent call for blood donations.
While all blood is welcomed and appreciated, the agency is especially interested in acquiring type-O blood, which can be administered to anyone, of any blood type. It’s universal, which makes it a rare prize indeed.
Recently, a group of researchers has identified enzymes from the human gut that can transform type-A and type-B blood into O up to thirty times more efficiently than previously studied enzymes. The team recently presented their findings at the 256th National Meeting & Exposition of the American Chemical Society (ACS).
Dr. Stephen Withers, a member of the research team, had this to say:
“We have been particularly interested in enzymes that allow us to remove the A or B antigens from red blood cells. If you can remove those antigens, which are just simple sugars, then you can convert A or B to O blood.”
One of the biggest hurdles that Dr. Withers and his team faced, though, was the sheer number of enzymes to investigate. Here, the group turned to colleagues at the University of British Columbia (UBC) for assistance.
UBC uses metagenomics to study ecology, which is essentially a process of taking all the organisms from a given environment and extracting the sum total DNA of those organisms, all mixed together in one giant data set. This enabled the team to cast an extremely wide net, sampling the genes of millions of microorganisms without needing to isolate individual cultures.
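The “wide net” idea can be illustrated with a toy sketch: pool sequences from many organisms into one dataset, then screen them all at once for a candidate signature. The reads and motif below are invented placeholders, not real genomic data or the actual enzyme family the team found:

```python
# Hypothetical pooled "metagenomic" reads from many organisms,
# all mixed together in one dataset (toy sequences for illustration)
reads = [
    "ATGGCGTACGAGAAACTG",
    "TTGACCGGTGAGAAATAA",
    "ATGTTTCCAGGGCCCTAA",
]

# A made-up motif standing in for a conserved enzyme-family signature
MOTIF = "GAGAAA"

# Screen every read in the pool in one pass -- no need to culture
# or isolate the individual organisms first
hits = [r for r in reads if MOTIF in r]
```

Real metagenomic screens work on millions of reads with far more sophisticated sequence-matching tools, but the payoff is the same: candidates surface from organisms that were never individually cultured.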
Their wide net caught something, and it turned out that the enzyme that held the key already lives inside us, in our gut biomes.
Glycosylated proteins known as mucins line the walls of our digestive system, where they provide sugars that serve as “hooks,” or points of attachment, for gut bacteria, while simultaneously feeding them as they assist in digestion.
Some of the mucins are quite similar in structure to the antigens of type-A and type-B blood, so the team began researching the enzymes that the bacteria use to take sugar from mucin. After a short search, they found what they were looking for: a whole new family of enzymes that are devastatingly effective at devouring blood antigens, morphing A and B blood into O very efficiently.
On the heels of their successful collaboration with the University of British Columbia, the team has reached out to them again for the next phase of their research, working with UBC’s Centre for Blood Research to validate the enzymes and test them on a larger scale in anticipation of clinical testing and human trials down the road.
Dr. Withers again:
“I am optimistic that we have a very interesting candidate to adjust donated blood to a common type. Of course, it will have to go through lots of clinical trials to make sure that it doesn’t have any adverse consequences, but it is looking very promising.”
Promising indeed. And great news for our nation’s blood supply.
There is copious evidence that fad diets don’t work. If they did, we wouldn’t have the obesity epidemic we do today. Of course, that doesn’t keep people from trying the latest diet of the moment. In fact, recent estimates indicate that some 45 million Americans “go on a diet” each year.
It’s big business, too. Americans spend more than $33 billion a year on special dietary foods and other weight-loss products. Again, though, if it were money well spent, we’d have the slimmest, healthiest population on the planet. We clearly don’t, and what’s worse, few people stop to ask tough questions about these fad diets and what they’re doing to our bodies.
That’s changing, thanks to a new study recently completed by Professor Maciej Banach and company, of the Medical University of Lodz, in Poland.
Professor Banach and his team focused on low-carb diets, with an eye toward examining the health risks associated with them. In particular, since carbs are a major energy source, how does a diet low in them impact human health?
Although the research provides no causal answer to the question, it does examine the link between low-carb diets generally and the risk of premature mortality, as well as mortality arising from specific diseases.
The team examined survey results drawn from nearly 25,000 individuals who had participated in the National Health and Nutrition Examination Survey between 1999 and 2010, searching for associations between low carb intake and the risk of death from stroke, cancer, cerebrovascular disease and coronary heart disease.
The average age of the study participants was 47.6 years, and their carb intake was calculated as a percentage.
Based on these percentages, the study participants were divided into four groups and were followed for an average period of 6.4 years. They were also classified as obese and non-obese based on their BMI (Body Mass Index).
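The article doesn’t give the study’s exact formulas, but carb intake as a percentage of calories and the BMI obesity cutoff are standard calculations. Here is a minimal sketch using the usual 4/4/9 kcal-per-gram conversion factors and the common BMI ≥ 30 obesity threshold; the example intake numbers are invented:

```python
def percent_carbs(carb_g, protein_g, fat_g):
    """Share of daily calories from carbohydrate, using the standard
    conversion factors: 4 kcal/g for carbs and protein, 9 kcal/g for fat."""
    total_kcal = carb_g * 4 + protein_g * 4 + fat_g * 9
    return 100 * carb_g * 4 / total_kcal

def bmi(weight_kg, height_m):
    """Body Mass Index: weight in kilograms over height in meters, squared."""
    return weight_kg / height_m ** 2

# e.g. a day's intake of 250 g carbs, 80 g protein, 70 g fat
share = percent_carbs(250, 80, 70)   # roughly half of calories from carbs
obese = bmi(95, 1.75) >= 30          # BMI >= 30 is the usual obesity cutoff
```

Participants would then be ranked by their carb percentage and split into groups (the study used four), with BMI classifying each person as obese or non-obese.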
The research team also examined the same associations in a massive meta-analysis of a variety of related studies that examined nearly 450,000 participants who were followed for an average of 15.6 years.
The results are alarming, to say the least. Here are some of the key findings:
- The participants who consumed the fewest carbs were 32% more likely to die prematurely from any cause.
- They were 51% more likely to die from coronary artery disease
- 50% more likely to die from cerebrovascular disease
- And 35% more likely to die of cancer
These results were mirrored in the meta-study, although the effects were somewhat blunted. In the larger group:
- The lowest-carb consumers were 15% more likely to die from any cause
- 13% more likely to die from coronary artery disease
- And 8% more likely to die from cancer.
Professor Banach had this to say about the results:
“Low carb diets should be avoided. The reduced intake of fiber and fruits and increased intake of animal protein, cholesterol, and saturated fat with these diets may play a role. Differences in minerals, vitamins, and phytochemicals might also be involved.
Low-carb diets might be useful in the short term to lose weight, lower blood pressure, and improve blood glucose control, but our study suggests that in the long-term, they are linked with an increased risk of death from any cause…”
Not that it’s likely to change many minds, but the data is sobering indeed.
According to statistics collected by the National Institute on Drug Abuse (NIDA), here are the contours of the problem we face:
- An average of 115 people in the United States die from an opioid overdose every single day.
- In 2015, 33,000 people in the US alone died from an opioid overdose, and an additional 2 million were living with disorders related to opioid abuse.
- Nearly a third (29%) of people who are prescribed opioids for pain relief wind up abusing them.
The scope and scale of the problem is enormous, and so far, little has been done to manage the issue, but that may be changing, thanks to a dedicated team of researchers led by Professor Mei-Chuan Ko, working out of the Wake Forest Baptist Medical Center in Winston-Salem, NC.
The team has recently finished testing (on primates) a new, non-addictive painkiller known as AT-121. The results of those tests were published not long ago in the journal “Science Translational Medicine,” and they are promising indeed.
AT-121 was designed with two goals firmly in mind: one, obviously, the relief of chronic pain, and two, blocking the addictive action of opioids. To that end, AT-121 acts simultaneously on the nociceptin receptor, which inhibits the addictive effects of opioids, and the mu opioid receptor, which provides the pain-relieving effect.
The results were nothing short of amazing. AT-121 has pain relieving effects similar to morphine, but only requires about 1% of a typical morphine dose to achieve similar results. Even better, since AT-121 targets both of the receptors mentioned above, it manages to avoid the side effects that opioids tend to induce, including respiratory depression, physical dependence, opioid-induced hyperalgesia, and abuse potential.
As Professor Ko explains:
“This compound…was effective at blocking abuse potential of prescription opioids, much like buprenorphine does for heroin, so we hope it could be used to treat pain and end opioid abuse.
The fact that this data was in nonhuman primates, a closely related species to humans, was also significant because it showed that compounds such as AT-121 have the translational potential to be a viable opioid alternative or replacement for prescription opioids.”
Based on their early research, the team is forging ahead rapidly with additional preclinical trials in a bid to prove that their new drug is safe. Once the additional trials have been completed, the team’s plan is to submit their findings to the FDA (Food and Drug Administration) for approval. From there, the next stop is clinical trials in humans.
While Professor Ko’s team is still quite some distance from having an alternative to addictive opioids on the market and ready for use, their early work is beyond promising. The day may soon be upon us when AT-121 can take the place of Vicodin, oxycodone, morphine, codeine, fentanyl, and others in the opioid family that have been creating as many problems as they’ve been solving, and that is very good news indeed.
While it’s true that acne isn’t a life-threatening condition, and a cure hasn’t exactly been a high priority of the medical establishment, it’s a condition that most of us will suffer, or have suffered at some point in our lives. It’s most common in teens, but can sometimes persist into adulthood, and in serious cases, the scarring from acne can last a lifetime.
In addition to that, research has shown that persistent acne causes more than simple discomfort. It can cause enough psychological distress to make people withdrawn and self-conscious about their appearance, to the point that it begins to impact their interpersonal relationships.
Current treatment options for acne include a range of antibiotics and retinoids, but these treatments are not always effective and, worse, they can often cause unappealing side effects, including dry skin and irritation.
That’s precisely why Chun-Ming Huang and his team of researchers have been looking into new and better treatment alternatives. After all, even though the condition isn’t life threatening, it impacts more than 40 million people in the US alone, and a not-insignificant percentage of those do not tolerate the current treatment options well.
Recently, Huang and his team published the results of their research in the Journal of Investigative Dermatology, where they explain their process in developing a safe, effective vaccine to treat acne.
Their first step was, of course, to study acne’s root cause. That part was easy enough and well understood. Acne is caused by the bacterium Propionibacterium acnes, which produces a toxin called the Christie-Atkins-Munch-Peterson factor, or CAMP factor for short. It is this toxin that is largely responsible for the inflammation of acne outbreaks.
Using this fact as a starting point, the team tested a set of monoclonal antibodies against the CAMP factor, with promising results. The antibodies have proven effective against the inflammation-inducing properties of the toxin in both a mouse model and skin cells collected from humans.
Commenting on the study, Emmanuel Contassot, of the University of Zurich in Switzerland, had this to say:
“…such vaccines would address an unmet medical need….at the same time, acne immunotherapies that target P. acnes-derived factors have to be cautiously designed to avoid unwanted disturbance of the microbiome that guarantees skin homeostasis.
…whether or not CAMP factor-targeted vaccines will impact multiple P. acnes subtypes and other commensals has to be determined, but acne immunotherapy presents an interesting avenue to explore nonetheless.”
In terms of next steps, Huang and his team are steadily working toward large-scale clinical trials and FDA approval. They admit that they’re quite some distance from that point, but the early results are beyond encouraging.
The day may soon come when we have a safe, effective treatment for acne that carries none of the side effects current treatment options are saddled with: something that works for the overwhelming majority of the millions of acne sufferers in the United States and around the world.
Again, this isn’t nearly as significant as cancer research or any number of other things, but it is still very good news and a most welcome development. Bravo to Professor Huang and his team.
Researchers at UCLA have made a major breakthrough with profound implications for anyone suffering from an immune deficiency disorder. They discovered a way to create synthetic T-cells, which are pivotal to the body’s immune response.
The quest to create artificial T-cells has been long and fraught with difficulty, because these cells have a number of unique qualities. Not least is their ability to deform, changing shape to squeeze through tight spaces (shrinking to about one quarter of their normal size in some cases) so they can reach the site of an infection anywhere in the body, then growing to as much as three times their normal size to overwhelm a would-be invader.
Once at the site of an infection, they release a variety of regulatory or inflammatory signals.
These complexities have made it impossible to replicate a functioning T-cell, until now.
The research team found success by using a microfluidic system, combining an alginate biopolymer with mineral oil, which merged to form a structure quite similar to the T-cells found in the body. Unfortunately, this was only half the battle. While the shape and behavior of the synthetic cells was accurate, the team needed a way to chemically encode their creations such that a human host would recognize them as T-cells and not reject them as just another form of infection.
To get around this problem, the team coated the synthetic cells with phospholipids, giving their creations a synthetic cellular membrane that closely mimics the membrane found in human T-cells. To achieve the needed elasticity, the group bathed the synthetic cells in a calcium ion bath, changing the concentration of the calcium ions to either shrink or grow the T-cells as needed.
That accomplished, they then linked their newly made cells with CD4 signalers via bioconjugation. CD4 signalers are the mechanism by which T-cells are activated to attack various forms of infection or cancerous cells.
Their experiments were wildly successful: in laboratory tests, the synthetic T-cells behaved in ways that closely mimicked their naturally occurring counterparts and successfully fended off infectious agents.
Of course, we’re still quite some distance from human trials, but these early experiments are beyond promising. What’s more, the research team said that the same basic process could be used to create other types of artificial cells, including macrophages that could target specific diseases.
The implications here are enormous. Untold millions of people suffer from a variety of conditions that weaken the body’s immune system, leaving it increasingly incapable of warding off infections and diseases that a healthy host could easily fend off. This, as often as not, is what proves to be fatal to people afflicted with such conditions.
If the UCLA team has anything to say about it, the day is coming when scientists will have access to a database that houses the “recipes” for a growing collection of artificial cells and will be able to custom-design treatments that are vastly more effective than anything available today. Great news indeed.
An interesting bit of research out of Ohio State University seeks to solve one of the most intractable problems in modern medicine: needles. Nobody likes them, but they’re an essential part of what healthcare practitioners do.
It’s amazing how big an impact getting stuck by a needle can have. If you work in the field, you’ve no doubt seen this firsthand. An otherwise tough as nails patient cringes, winces, hisses, mutters profanity, and is sometimes reduced to tears the moment you stick him (or her) with a needle.
The development of a pain free needle would be a game changer in the medical world, but how on earth would one go about it? Researchers at Ohio State think the answer lies in studying the humble mosquito.
After all, when a mosquito bites you, what it’s actually doing is inserting a needle-like probe into your skin so it can feed. Often, it will feed on you for several minutes before you’re even aware of it, so there’s certainly something to the idea.
Bharat Bhushan, a professor of mechanical engineering at Ohio State and a member of the research team, summed it up this way:
“Mosquitoes must be doing something right if they can pierce our skin and draw blood without causing pain. We can use what we have learned from mosquitoes as a starting point to create a better microneedle…right now, needles are very simple. There hasn’t been much innovation and we think there’s a way to try something different.”
Bhushan has built his career around studying nature and using his findings to improve products, so in that respect, his focus on mosquitoes comes as no surprise. The research was interesting, mostly involving diving deep into previous studies conducted by entomologists, but the team had a very specific focus.
Using their extensive backgrounds in engineering, they broke down the biomechanical parts of the mosquito that contributed to painless mosquito bites.
One of their key areas of focus was the outer covering of the mosquito’s proboscis, called the labrum. Their goal was to measure how hard and stiff it was in several places so they could use it as a guide for potentially creating needles based on the same design. Interestingly, they discovered that the labrum is softest near its tip and edges, becoming harder and stiffer farther up (and in) the labrum.
Another discovery of interest was the fact that the part of the proboscis responsible for the intake of blood (called the fascicle) actually more closely resembles a saw than a needle. While that sounds like it should be more painful, not less, in reality the serrated edges make insertion easier, especially given that the mosquito vibrates it as it’s being inserted, which helps to reduce the force needed to pierce the skin.
Finally, another critical factor in pain-free mosquito bites is their use of a numbing agent. When they bite, mosquitoes release saliva into the wound, and that saliva contains a protein that lessens pain. Put those elements together and you get a pain-free “needle.”
Translating that into an artificial needle, Bhushan envisions a microneedle that contains two even smaller needles inside. One would “hit first,” and immediately inject a numbing agent. The second needle would have a serrated design and be softer at the tip and edges. Like the mosquito’s fascicle, it would vibrate as it’s being inserted.
Obviously, such a needle would be more expensive than the needles in use today, but patients would almost certainly be willing to pay a premium for a pain-free experience.
Bhushan summarizes: “We have the materials and knowledge to create a microneedle like this. The next step is to find the funding support to create and test such a device.”
The sooner that happens, the better.