Breakthrough in Blindness

Dr. Jeffrey Y H Chung

Recently, there has been a major scientific breakthrough by a group of scientists in Montreal, who have discovered that a protein found in the retina plays an essential role in the function and survival of the light-sensing cells necessary for vision.  Such findings could have a huge impact on our understanding of the retinal degenerative diseases that cause blindness.  The researchers studied compartmentalization, the process that establishes and maintains distinct compartments within a cell, each containing a specific set of proteins.  This process is essential for neurons to function properly.

The compartments within a cell are like the different parts of a car: just as gas must be in the fuel tank to power the engine, proteins need to be in a specific compartment to properly perform their functions.  One good example of compartmentalization can be seen in the photoreceptors, a specialized type of light-sensing neuron in the retina made up of different compartments containing specific proteins essential for vision.  The researchers wanted to understand how compartmentalization is achieved within photoreceptor cells, and their work has identified a new mechanism that may explain this process.  More specifically, they found that a specific protein known as Numb functions like a traffic controller, directing proteins to the appropriate compartments.

The researchers demonstrated that without Numb, photoreceptors are unable to send a molecule essential for vision to the correct compartment, causing the cells to progressively degenerate and die.  The death of photoreceptor cells is known to cause retinal degenerative diseases in humans that ultimately lead to blindness, so this work provides a new piece of the puzzle for understanding how and why these cells die.  Such results could substantially affect the development of treatments for retinal degenerative diseases, such as Leber’s congenital amaurosis and retinitis pigmentosa, by providing novel drug targets to prevent photoreceptor degeneration.

 

Ebola Vaccine?

Dr. Jeffrey Y H Chung

In response to the Ebola outbreak that has spread across West Africa, scientists around the world are scrambling to develop a vaccine against the disease.  Now, a study by researchers from the National Institutes of Health describes a vaccine that has generated long-term immunity against the Ebola virus in monkeys, and the vaccine is currently entering its first phase of clinical trials in humans.  If the current outbreak is any indicator, fatality rates during Ebola outbreaks are high indeed.  As of August 31st this year, the World Health Organization (WHO) estimated that there have been 3,685 cases of Ebola and 1,841 deaths from the disease in West Africa, with the numbers only getting higher.

There is currently no vaccine for Ebola, although scientists have been working to develop one for quite some time.  An international research team has rapidly sequenced 99 Ebola virus genomes in an attempt to better understand the virus and find ways to contain it.  According to a recent report in the Annals of Internal Medicine, an experimental drug called ZMapp appeared to treat Ebola in two men who contracted the infection in Liberia.  ZMapp, which has only been tested in monkeys, has yet to be approved for public use by the US FDA.

In a recent study, the Vaccine Research Center at the National Institute of Allergy and Infectious Diseases (NIAID) claims to have developed a vaccine based on a chimpanzee-derived adenovirus vector, a chimpanzee “cold” virus.  Previous efforts to create an Ebola vaccine have used human adenoviruses, but the researchers explain that because many humans have been exposed to these before, their immune systems are primed to neutralize them.  As such, the researchers decided to use chimpanzee adenoviruses that the human immune system hasn’t encountered, testing them on macaque monkeys.  They discovered that inoculation with the cold virus alone provided short-term, and limited long-term, protection against EBOV, one of the most common and deadly forms of the Ebola virus and the one responsible for the majority of the latest outbreak.  The researchers found that injection with the chimpanzee cold virus protected the monkeys against EBOV for 10 months.  The vaccine appeared to increase the number of T cells in the monkeys, which in turn defended against the virus.

Even though this study assessed the vaccine’s ability to protect against EBOV, the team believes that it could also serve to protect against another common form of the virus, SUDV.  As a result of these findings, the National Institutes of Health recently announced that the vaccine will enter phase 1 clinical trials to test its effectiveness against EBOV and SUDV in humans.

New Method of Medicine?

Dr. Jeffrey Y H Chung

The activation cycle of a G-protein (purple) by a G-protein-coupled receptor (light blue) receiving a ligand (red)

Roughly 40% of medicines used today work through so-called “G protein-coupled receptors”.  These receptors react to changes in the cell environment, for instance to increased amounts of chemicals such as cannabis, adrenaline, or the medications we take, and are therefore of paramount importance to the pharmaceutical industry.  They play a key role in recognizing and binding different substances.  Now, a new nanotechnology method could improve the development of new medicines and reduce costs.  The method is described in a publication in the esteemed scientific journal “Nature Methods”.

This new method dramatically reduces the use of precious membrane protein samples.  Medicinal substances are traditionally tested using small drops of a sample containing the protein that the medicine binds to.  Looked at closely enough, however, each drop is composed of thousands of billions of small nano-containers holding isolated proteins.  Until now it had been assumed that all of these nano-containers are identical, but the researchers discovered that this is not the case: each of the countless nano-containers is unique.  Because information can be collected about each individual nano-container, samples a billion times smaller can be used for testing drug candidates.  That per-container information can then be used to construct high-throughput screens, for example to test how medicinal drugs bind G protein-coupled receptors.

 

Corrective Lens for Screen

Imagine if computer screens, rather than the people using them, had glasses.  This might sound crazy, but thanks to technology being developed by UC Berkeley computer and vision scientists, it might not be too far away.  The researchers are currently developing computer algorithms that compensate for a user’s visual impairment, creating vision-correcting displays that help users see text and images clearly without wearing eyeglasses or contact lenses.  This technology could potentially help the countless people who need corrective lenses to use their smartphones, tablets and computers.  One common problem this technology could address is presbyopia, a type of farsightedness in which the ability to focus on nearby objects diminishes over time as the eyes’ lenses lose elasticity.

More importantly, these displays could eventually aid people with more complex visual problems that can’t be corrected by eyeglasses.  In this day and age, displays are everywhere, and interacting with them is often taken for granted.  People with higher-order aberrations tend to have irregularities in the shape of the cornea, making it difficult to fit a contact lens; in some instances, this can be a barrier to holding certain jobs.  Such research could dramatically improve these people’s lives.  UC Berkeley researchers teamed up with colleagues from MIT to develop the latest prototype of a vision-correcting display.  The setup adds a printed pinhole screen, sandwiched between two layers of clear plastic, to an iPod display to enhance image sharpness.  The tiny pinholes are 75 micrometers across and spaced about 390 micrometers apart.  The research team will present this computational light field display on August 12 at the International Conference and Exhibition on Computer Graphics and Interactive Techniques, or SIGGRAPH, in Vancouver.
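As a rough illustration of why pinhole masks trade away brightness (the 75-micrometer and 390-micrometer figures come from the article; the square-grid layout is my own illustrative assumption), the fraction of display light such a mask passes can be estimated in a few lines:

```python
import math

pinhole_diameter_um = 75.0   # each pinhole is 75 micrometers across (from the article)
pitch_um = 390.0             # pinholes spaced about 390 micrometers apart (from the article)

# One pinhole's open area divided by the area of one grid cell,
# assuming the pinholes sit on a square grid (an illustrative simplification).
open_area = math.pi * (pinhole_diameter_um / 2) ** 2
fill_factor = open_area / pitch_um ** 2

print(f"Roughly {fill_factor:.1%} of the display's light passes through the mask")
```

Under these assumptions only a few percent of the light gets through, which is consistent with the brightness and contrast challenges that barrier-based displays have to overcome.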

What makes this project so interesting is that it uses computation instead of optics to correct vision.  The algorithm works by adjusting the intensity of each direction of light emanating from a single pixel, based on a user’s specific visual impairment.  In a process known as deconvolution, the light passes through the pinhole array in such a way that the user sees a sharper image.  In the experiment, the researchers displayed images that appeared blurred to a camera set to simulate a farsighted person; when the new prototype display was used, the blurred images appeared sharp through the camera lens.  This approach improves upon earlier versions of vision-correcting displays, which produced low-contrast images, by combining light field display optics with novel algorithms.  The research prototype could easily be developed into a thin screen protector, and continued improvements in eye-tracking technology could make it easier for displays to adapt to the position of the user’s head.
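The idea of pre-correcting an image so that an optical blur undoes it can be sketched with a toy model.  This is not the Berkeley team’s algorithm (their method operates on a light field through the pinhole array); it is a minimal sketch assuming the eye’s defocus can be modeled as a simple Gaussian blur and inverted with a Wiener-style filter:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """A normalized 2-D Gaussian, standing in for the eye's defocus blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def pad_to(kernel, shape):
    """Zero-pad the kernel to the image size, centered at the origin for the FFT."""
    out = np.zeros(shape)
    kh, kw = kernel.shape
    out[:kh, :kw] = kernel
    return np.roll(out, (-(kh // 2), -(kw // 2)), axis=(0, 1))

def blur(image, kernel):
    """Circular convolution via the FFT, modeling the eye blurring the screen."""
    H = np.fft.fft2(pad_to(kernel, image.shape))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

def wiener_precorrect(image, kernel, noise=1e-3):
    """Pre-sharpen the image so that blurring it roughly recovers the original."""
    H = np.fft.fft2(pad_to(kernel, image.shape))
    G = np.conj(H) / (np.abs(H) ** 2 + noise)  # Wiener-style inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * G))
```

Blurring the pre-corrected image lands much closer to the original than blurring the image directly; the real system must also handle contrast, non-negative pixel intensities and the viewer’s position, which is where the light field optics come in.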

The Role of Violence in Evolution

From evolutionary biology comes a new study suggesting that the shape of our face is actually a result of violence among our prehistoric ancestors.  Researchers behind the study say that human faces evolved to minimize the impact of injury from punches to the face.  The team, led by biologist David Carrier and physician Michael H Morgan, focused on our australopith ancestors: human-like primates who lived in Africa between 6 and 1.2 million years ago and strongly resemble the modern-day chimpanzee.  Australopiths had a mix of human and ape traits; they walked on two legs and had small brains and small canine teeth, although their cheek teeth were quite large.

Australopith

An artist’s rendering of an australopith.

The most widely known example of an australopith is “Lucy”, a well-preserved fossilized skeleton from Ethiopia that dates to about 3.2 million years ago.  Previously, the most prominent hypothesis held that the evolution of our ancestors’ faces was driven by the need to chew foods that were difficult to crush, such as nuts.  However, Carrier says that australopiths had traits that could have boosted fighting ability, such as hand proportions that allowed them to make a fist, which is effective for striking others.  If the evolution of our hand proportions was indeed associated with selection for fighting behavior, then you might expect the primary target, the face, to have evolved to better protect it from injury.  Even today, when humans fight with their hands, they typically aim for the face.

What Carrier and his team found was that the bones that suffer the highest rates of fracture in fights are the same parts of the skull that showed the greatest increase in robusticity during the evolution of basal hominins.  He further explains that these bones are the parts of the skull that differ most between males and females in both australopiths and humans.  In other words, male and female faces are different because the parts of the skull that break in fights are bigger in males.  Most importantly, he notes that these facial features appear in the fossil record around the same time that our ancestors developed the hand proportions that allowed them to make a fist.  The observations suggest that many of the facial features that characterize early hominins may have evolved to protect the face from injury during fistfights.

The team adds that their study contributes to the continuing conversation about the role violence played in our evolution, and their work suggests that violence played a much larger role in human evolution than many anthropologists accept.  Morgan believes that the science behind this is strong and fills longstanding gaps in existing theories of why the musculoskeletal structures of our faces developed the way they did.  Such research into the evolution of our human ancestors is relevant because it provides insight into how and why we evolved into what we are now.  Carrier says that the new research not only provides a different explanation for the evolution of our faces, but also addresses the controversial question of how large a role violence played in the early days of our ancestors.  This question goes back to the French Enlightenment philosopher Rousseau, who argued that before civilization corrupted mankind our ancestors were “noble savages”, an idea that remains strong in the social sciences.

Fovea Breakthrough

Among humans, a tiny area in the center of the retina, called the fovea, is critically important for viewing fine details.  Densely packed with cone photoreceptor cells, it is typically used while reading, driving and gazing at objects of interest.  Certain animals have a similar feature in their eyes, but researchers had previously believed that the fovea was unique to primates.  According to vision scientists at the University of Pennsylvania, dogs also have an area of their retina that strongly resembles the human fovea.  In addition, this region is susceptible to genetic blinding diseases in dogs, much as it is in humans.

It’s remarkable that humans and dogs have been interacting with each other for some 20,000 years, and we’re now learning something about them that has not only eluded us all this time but is also clinically relevant.  Dogs are known to have what is called an “area centralis”, a region around the center of the retina with a relative increase in cone photoreceptor cell density.  However, dogs lack the pit formation that humans have, and before this study it was believed that their increase in cone density was nowhere near what is seen in primates.  The highest density previously reported in dogs was 29,000 cones per square millimeter, compared to more than 100,000 cones per square millimeter in the foveas of humans and macaques.

However, it turns out that previous studies in dogs had missed a minuscule region of increased cell density.  In this study, while examining the retina of a dog with a mutation that causes a disease akin to a form of X-linked retinal degeneration in humans, the Penn researchers noticed a thinning of the retinal layer that contains photoreceptor cells.  Focusing on this region, they examined the retinas of normal dogs using advanced imaging techniques such as confocal scanning laser ophthalmoscopy, optical coherence tomography and two-photon microscopy.  These techniques enabled the scientists to visualize the different retinal layers, identify a small area of peak cone density and estimate cone numbers by counting the cells in this unique area.  The researchers found that cone densities reached more than 120,000 cells per square millimeter in a never-before-described fovea-like region within the area centralis, a density on par with a primate’s fovea.  They also recognized that the “output side” of this cone-dense region corresponded with an area of dense retinal ganglion cells, which transmit signals to the brain.

Human patients with macular degeneration experience a loss of photoreceptor cells at or near the fovea, which results in a loss of central vision.  To see whether the fovea-like region was similarly affected in dogs, the Penn researchers used the same techniques to examine animals with mutations in two genes (BEST1 and RPGR) that can lead to macular degeneration in humans.  In both cases, the onset of disease affected the fovea-like region in dogs in much the same way that the disease affects humans, with central retinal lesions appearing earlier than lesions in the peripheral retina.  Why the fovea is so susceptible to early disease expression in certain hereditary disorders, and why it is spared under other conditions, is as yet unclear.  However, these findings could enable translational research by allowing researchers to test treatments for human foveal and macular degenerative diseases in dogs.

In addition, this discovery offers unique insight into a rare human condition, fovea plana, in which people have normal visual acuity but no “pit” in their fovea, so that their foveas resemble those of dogs.  The fact that dogs have a fovea-like area of dense photoreceptor cells could indicate that dogs see more acutely than people had previously assumed.  Looking ahead, researchers may focus on this fovea-like area in studies of therapies not only for X-linked retinal degeneration and Best disease, but also for other sight-related problems that affect the macula and fovea.

Star-Shaped Cataracts

I recently came across an article about a 42-year-old electrician who suffered a severe electric shock while working.  After the shock, the man developed cataracts, which wasn’t surprising.  What was surprising was their shape: star-shaped.  The short case study about this injury was documented by Bobby Korn and Don Kikkawa of the University of California, San Diego, and was published in the New England Journal of Medicine.  During the accident, the man was hit with 14,000 volts in his left shoulder.  In addition to his other injuries, he developed these unique, star-shaped cataracts.  While cataract surgery would normally have been enough to restore his eyesight, he sustained so much damage to his retina and optic nerve that he was left permanently blind.

Cataracts are areas of cloudiness that accumulate in the lens of the eye.  The now-opaque lens loses its ability to properly focus light onto the retina at the back of the eye, which decreases vision.  Cataracts are usually treated with surgery, although medicated eye drops can also be used to treat or prevent mild cases.  Although there is a long list of diseases and risk factors for cataracts, those formed by trauma are a completely different type.  The star shape indicates a “stellate cataract”.  These aren’t completely cloudy; white fractal patterns usually serve as the boundary of the cataract.  Trauma has also been known to produce rosette cataracts, which take on a flowery or feathery appearance.  This type of irregular cataract can form right away, although in some cases it takes a couple of years before eyesight is affected.  These cataracts form when the eye is penetrated, experiences blunt force trauma or is injured indirectly, as happened with the electrician.  The shockwave causes injury to the cortex or capsule of the lens, which may be why the cataracts don’t develop uniformly.

The Importance of Protein

Cilium

Cilium, as shown through a microscope.

According to a recent article, researchers from Penn State University and the University of California have discovered why a particular protein is important for the growth of healthy cells in mammals.  In previous research, Aimin Liu, co-author of the study and associate professor of biology at Penn State University, found that the protein C2 calcium-dependent domain containing 3 (C2cd3) is necessary for cilia to grow on cell surfaces.

Cilia are hair-like structures found on the surface of mammalian cells that are responsible for transmitting and processing information in the body.  Without cilia, cells can’t sense what’s going on around them and can’t communicate.  Cilia also filter out bacteria, preventing them from entering the body’s organs.  A lack of cilia can lead to serious health conditions, such as polycystic kidney disease, blindness and neurological disorders.  According to the researchers, their finding about cilia has important implications for human health.

Liu and his colleagues first learned that C2cd3 was important for cilium formation when they discovered that mice lacking the protein had serious developmental problems.  At the time, however, the researchers didn’t understand why a lack of C2cd3 led to these problems, a question this new study answers.

The researchers knew that cilia grow from the centriole, a structure that attaches itself to the inner surface of the cell and acts as an anchor.  Before a cell can grow cilia, it needs to assemble a set of appendages at one end of the centriole; these appendages attach the centriole to the cell’s surface so that the cilium can grow.  However, the researchers had no idea how the appendages were assembled.

The new research revealed that without the C2cd3 protein, appendages aren’t assembled at the end of the centriole, meaning the centriole isn’t linked to the cell membrane and therefore can’t recruit the other proteins that allow cilia to grow.  This means, according to Liu, that the protein is required for the very first step of putting a cilium together.  The researchers hope their findings will lead to a better understanding of cilium development, as well as better treatments for cilia-related diseases.  Abnormal cilium function can lead to numerous diseases, such as cystic disorders of the kidney and liver, and can even lead to blindness or deafness.