#PhysicsPlus: Physics-based Imaging Techniques for Cancer Detection and Treatment

Physics, in its age-old quest to unravel the mysteries of nature, has become one of the most powerful enablers of innovation and discovery, driving remarkable advances in both academia and industry. Many technological advances have emerged from curiosity-driven research in fundamental areas of physics, yielding tremendous social and economic benefits and leaving a deep imprint on our daily lives. One of the most significant of these contributions is to the flourishing advancement of Medical Science.

Cancer, arguably one of the biggest curses of human civilization, kills millions of people all over the world every year. In the US alone, an estimated 1.8 million new cases were diagnosed in 2020. According to the World Health Organization (WHO), alongside COVID-19 and pulmonary diseases, cancer remains this year one of the biggest killers of human lives.

Cancer is a disease caused by uncontrolled cell division. The rapid cell division usually produces a tumor, which then spreads into and destroys the surrounding tissue. Research in modern physics plays a crucial role in improving both the diagnosis and the treatment of cancer by developing various imaging techniques, which allow us to have a vision beyond the obvious. Innovative methods based on different types of radiation allow medical physicists not only to detect cancerous tissue but also to kill cancer cells in a controlled and safe manner. Understanding, generating and manipulating that radiation has been made possible by research in fundamental areas of physics such as cosmology, astrophysics, nuclear physics and particle physics. Many of the recent advances in cancer therapy are the outcome of novel radiation detectors (which act as imaging cameras) and other instruments originally designed for physics research, combined with advanced software.

Different Imaging Methods used to detect Cancer

  • Computed Tomography (CT) Scan:

In 1963, physicist Allan Cormack first devised a methodology for computed tomography scanning using rotating X-rays. In 1971, Godfrey Hounsfield at the EMI Laboratory developed a scanner that reconstructs internal anatomy from multiple X-ray images taken around the body. Cormack and Hounsfield were jointly awarded the 1979 Nobel Prize for their unmatched contributions in the field of Medicine.

  • Positron Emission Tomography (PET) Scan:

Another well-known imaging technique is the Positron Emission Tomography (PET) scan. The first PET image was taken by biophysicist Michael Phelps and nuclear chemist Edward Hoffman in 1973.

PET relies on a particular type of radioactive decay in which positrons, the antiparticles of electrons, are emitted. A positron-emitting isotope is attached to a bioactive material, injected into the body, and finally accumulates in the targeted cells. The positrons emitted by the isotope annihilate with nearby electrons, a process commonly known as pair annihilation. Since mass is destroyed, energy is created following the famous relation $E = mc^2$, and it is emitted in the form of photons (i.e., light), which are then detected by the imaging cameras and used to construct a three-dimensional image.
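
As a quick back-of-the-envelope check of the energetics (using only the electron rest mass and the speed of light), each annihilation photon carries about 511 keV:

```latex
% Energy of each annihilation photon, from the electron (or positron) rest mass
E \;=\; m_e c^2
  \;=\; (9.11\times 10^{-31}\,\mathrm{kg})\,(3.0\times 10^{8}\,\mathrm{m\,s^{-1}})^{2}
  \;\approx\; 8.2\times 10^{-14}\,\mathrm{J}
  \;\approx\; 511\ \mathrm{keV}
```

Each annihilation therefore emits a pair of 511 keV gamma photons travelling in nearly opposite directions, and the PET scanner reconstructs the image from these coincident pairs.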

  • Single Photon Emission Computed Tomography (SPECT) Scan:

Single photon emission computed tomography (SPECT) is a non-invasive nuclear imaging technique, frequently used in oncology to examine the blood flow to tissues and organs. In this method, radioactive tracers are injected into the blood to generate images of blood flow to major organs such as the brain and heart. The tracers emit gamma rays, which are detected by a gamma camera, and computerized methods then build 3D images. Because the tracer remains in the bloodstream rather than being absorbed by the surrounding tissues, the images are limited to areas where the blood flows.

  • Magnetic Resonance Imaging (MRI):

In 1980, another breakthrough imaging technique, Magnetic Resonance Imaging (MRI), was reported. It relies on the fact that a magnetic moment aligns itself along the direction of an externally applied magnetic field.

 

When a human body is placed in a strong magnetic field, the free hydrogen nuclei associated with the water molecules in the body align themselves along the field direction. Usually a uniform magnetic field of strength ∼1-3 tesla is applied in MRI. Next, a radio-frequency (RF) pulse is applied perpendicular to the static magnetic field, which tilts the magnetic moments associated with the water molecules away from the field direction. When the RF signal is withdrawn, the magnetic moments relax back to their initial orientation, radiating a radio-frequency signal of their own, which is detected by a conductive coil placed around the body. The detected signal is then reconstructed to obtain 3D grey-scale images.
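
For a sense of scale, the RF pulse must be tuned to the Larmor precession frequency of the hydrogen nuclei, set by the well-known proton gyromagnetic ratio:

```latex
% Larmor (resonance) frequency of hydrogen nuclei in a static field B_0
f \;=\; \frac{\gamma}{2\pi}\, B_0,
\qquad \frac{\gamma}{2\pi} \;\approx\; 42.58\ \mathrm{MHz\,T^{-1}}
% e.g.  B_0 = 1.5 T  ->  f ~ 63.9 MHz ;   B_0 = 3 T  ->  f ~ 127.7 MHz
```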

MRI offers much better resolution of soft-tissue images than X-rays. The relaxation time, as well as the energy radiated by the magnetic moments, depends on the environment and the chemical nature of the molecules, so physicians can differentiate between various types of tissue based on these magnetic properties. The faster the protons realign, the brighter the image. In 2003, Peter Mansfield of the University of Nottingham shared the Nobel Prize in Medicine with Paul Lauterbur of the University of Illinois for their “discoveries concerning magnetic resonance imaging”.

  • Optical Coherence Tomography (OCT):

Optical coherence tomography (OCT) is a form of “optical ultrasound” in which infrared laser light is directed onto the skin and the light reflected back from the tissue layers just beneath the surface is collected to form a very high resolution image. Although OCT images contain more detailed information than MRI, the light can only penetrate a few millimetres, which makes the technique most useful for detecting cancers of the skin and esophagus, for example.

  • Selected Ion Flow Tube mass spectrometry (SIFT):

Selected ion flow tube mass spectrometry is a technique developed by a group of astrophysicists at the University of Birmingham while investigating the chemistry of interstellar clouds. It can sense tiny amounts of gas and can be used to detect certain cancers by analyzing a sample of a patient’s breath.

Physicists are also working to develop imaging techniques based on low-energy, non-ionizing THz radiation, which can penetrate a few millimetres of human tissue with low water content, making it an ideal candidate for probing breast and skin cancer at a very early stage. The first THz cameras were developed by astrophysicists to image the distant universe.

Treatments of Cancer using the concepts of Physics

  • Radiotherapy:

Radiotherapy is one of the most effective treatments for cancer. In this technique, high-energy radiation, which includes not only X-rays but also high-energy particle beams such as electrons and protons, is selectively deposited in the cancerous cells; if the dose is right, the radiation kills the cells by breaking the DNA strands in the cell nuclei. Modern proton beam therapy is the most precise form of radiation treatment available today. It destroys the tumor selectively, leaving the surrounding healthy tissues and organs almost completely unaffected.

  • Brachytherapy 

In brachytherapy, another well-known cancer treatment, artificially produced radioactive “seeds” are enclosed inside protective capsules and then delivered to the tumor, where they emit beta or gamma rays to provide a highly localized dose.

  • Boron neutron capture therapy 

Boron neutron capture therapy is used mostly to treat cancers of the head and neck. A non-radioactive form of boron is delivered into the cancer cells and a beam of neutrons is then directed at them. Boron absorbs the neutrons much more readily than human tissue does, forming lithium ions and high-energy alpha particles, which together deliver the required radiation dose to the tumor.

With all these therapies, people are trying to reduce the death toll caused by this deadly disease. Today, with diseases such as COVID-19 and cancer around us, let us ask ourselves: are we advanced enough to protect ourselves, or do our vulnerabilities persist helplessly in the face of nature? Either way, the journey of science will continue, unstoppable, fighting against all odds for the betterment of humankind and a deeper understanding of nature.

 

#EnergyNext: Thermoelectricity at molecular junctions, a breakthrough in Nanoscience!

Why Thermoelectricity?

Fig. 1: Schematic illustration of thermoelectric effects at a molecular junction

(Figure taken from https://doi.org/10.1002/adfm.201904534)

Recent studies report that almost two-thirds of the energy generated by conventional power stations is lost as waste heat. What if we could reuse that wasted heat and convert it into usable electricity? Wouldn’t that be great? Here comes the idea of thermoelectricity. Over the past few decades, as a measure against global warming, the significance of recovering waste heat and converting it to electrical energy has been recognized anew as a major challenge for both science and technology in addressing the global energy crisis. Though there are various methods to recycle waste heat, much attention is being paid to ‘Thermo-Electric (TE) energy conversion’ because of the ‘green’ nature of the conversion process (i.e., harvesting power from waste heat) and the easy device maintenance that comes from having no moving mechanical parts, which makes it technologically intriguing. Thermoelectric devices convert heat directly into usable electrical energy without any heavy moving parts like turbines, simply by exploiting a temperature gradient across them.

Seebeck and Peltier Effects

Fig. 2: Schematic illustration of the Seebeck effect, where a thermal gradient induces a voltage across the two junctions.

The first thermoelectric effect, now known as the Seebeck effect, was discovered way back in 1822 by Seebeck: a voltage is induced across the junctions of two different metals kept at different temperatures. Thus, heat energy is converted into electrical energy in this process. If $\Delta V$ is the voltage generated due to a temperature difference $\Delta T$, the Seebeck coefficient is defined by their ratio, i.e., $S = \Delta V / \Delta T$.

Twelve years later, in 1834, Peltier observed that the temperature changes at the junction of two different metals when an electric current is passed through it. If an amount of heat $Q$ is generated or absorbed due to a current $I$ flowing through the junction, then the two are related as $Q = \Pi I$, where $\Pi$ is the Peltier coefficient.

Fig. 3: Schematic illustration of the Peltier effect, where a thermal gradient develops due to a current flowing through a circuit.

Soon after this discovery, in 1838, Lenz performed a simple experiment that had a great impact on technological applications of the thermoelectric effect. He put a water droplet at each of the two junctions of a closed bismuth-antimony loop and passed an electric current through the system. One water droplet then froze into ice while the other remained liquid. On reversing the direction of the current flowing through the loop, the ice at one junction melted into water and the water droplet at the other junction froze into ice. This experiment indicated that the thermoelectric effect can be used for both power generation and refrigeration.

Fig. 4: Schematic diagram of a thermoelectric power generator.

Now, if a thermoelectric material is connected to two heat baths maintained at temperatures Th (the hot junction) and Tc (the cold junction), then a thermodynamic argument shows that the efficiency of thermoelectric power generation depends on a quantity called the Figure of Merit (ZT), where T is the average temperature, defined as T = (Tc + Th) / 2.
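
For reference, the standard result of that thermodynamic argument gives the maximum conversion efficiency in terms of ZT evaluated at the average temperature T:

```latex
% Maximum efficiency of a thermoelectric generator operating between T_h and T_c,
% with ZT evaluated at the average temperature T = (T_h + T_c)/2
\eta_{\max} \;=\; \frac{T_h - T_c}{T_h}\,
                  \frac{\sqrt{1 + ZT}\, -\, 1}{\sqrt{1 + ZT}\, +\, T_c/T_h}
```

The first factor is the Carnot limit; as ZT grows, the second factor tends to 1, which is why a larger figure of merit pushes the device towards Carnot efficiency.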

 What is Figure of Merit?

The parameter ZT (a dimensionless quantity) is called the Figure of Merit (FOM), and it plays the crucial role in determining the quality of the thermoelectric material used. As the value of ZT increases, the thermoelectric efficiency approaches the ideal Carnot-cycle efficiency. It is defined as $ZT = G S^{2} T / k$, where $G$ and $k$ are the electrical and thermal conductances of the material, respectively, and $S$ is the Seebeck coefficient, or thermopower, of the thermoelectric material used. The thermal conductance includes both electronic and phononic contributions.
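
As an illustration, here is a minimal Python sketch of how ZT and the resulting maximum efficiency are computed from G, S, T and k. The numerical values below are purely illustrative assumptions, not measured data for any particular material:

```python
import math

def figure_of_merit(G, S, T, k):
    """Dimensionless figure of merit ZT = G * S^2 * T / k.

    G : electrical conductance (S), S : Seebeck coefficient (V/K),
    T : average temperature (K),    k : thermal conductance (W/K).
    """
    return G * S**2 * T / k

def max_efficiency(Th, Tc, ZT):
    """Maximum thermoelectric-generator efficiency for a given ZT."""
    carnot = (Th - Tc) / Th
    root = math.sqrt(1.0 + ZT)
    return carnot * (root - 1.0) / (root + Tc / Th)

# Illustrative (assumed) numbers for a hypothetical junction
Th, Tc = 400.0, 300.0     # hot and cold bath temperatures (K)
T = 0.5 * (Th + Tc)       # average temperature (K)
G = 1e-4                  # electrical conductance (S)  -- assumed
S = 200e-6                # Seebeck coefficient (V/K)   -- assumed
k = 2e-9                  # thermal conductance (W/K)   -- assumed

ZT = figure_of_merit(G, S, T, k)
print(f"ZT = {ZT:.2f}, max efficiency = {100 * max_efficiency(Th, Tc, ZT):.1f} %")
```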

Evolution since inception:

In the beginning, metals were the obvious first choice for applying these ideas in technology, before the widespread use of semiconductors. But for metals the value of ZT was found to be much less than unity at all temperatures, so attention then shifted to bulk semiconductors.

Many thermoelectric materials have been studied to date (including Bi2Te3, Mn-Si, Bi-Sb-Te-Se, etc.), but in most cases ZT remains of order unity or less at room temperature. For bulk samples it has not been possible to increase ZT much, in line with theoretical predictions. ZT can be increased in a bulk thermoelectric material by changing the carrier concentration through doping, but this has a limitation: G and S change in opposite senses as the carrier concentration is varied in bulk materials. Therefore, to increase ZT, low thermal conductance together with enhanced S and G is preferred. To obtain low thermal conductance, quantum confinement is a way out. So people started to think about using quantum wires, and molecular junctions that are even smaller than regular quantum wires, as prospective thermoelectric materials.

Molecules as efficient functional Thermoelectric elements:

In discussing thermopower in nanoscale junctions, the main focus is on organic molecules or self-assembled monolayers trapped between two macroscopic electrodes, but one can also consider a quantum dot (QD), a nanotube, various non-trivial topological systems, etc., in place of the molecular system. Such low-dimensional nanostructured systems can be of crucial significance as room-temperature thermoelectrics.

Their availability, low dimensionality and low thermal conductivity make molecular systems a natural choice for next-generation thermoelectric materials with enhanced ZT. Since molecules are automatically nanostructured, and their electrical conductance (G) and Seebeck coefficient (S) can be tuned externally by applying a gate voltage or by varying other parameters, they are potential candidates for efficient, high-power thermoelectric devices.

Very few experiments have been performed so far on molecular junctions for measuring thermopower. Among them, a 2007 article in Science reported that the Seebeck coefficient and the conductance can be increased simultaneously by applying a gate voltage, i.e., by shifting the equilibrium Fermi level $E_F$ closer to the resonance peak. This strongly suggests that a much higher figure of merit can be achieved in molecular systems.

Thermoelectricity and DNA

Naturally, the next challenge is to find which molecule and/or material junction leads to the best experimentally achievable thermoelectric properties.

DNA, the basic building block of our genetic code, may hold large potential for applications in nanotechnology. The proliferation of DNA sequencing may have a deep impact on clinical medicine, health care and forensic investigation. Attention is therefore also being paid to the non-invasive detection of nucleotides along DNA strands, beyond the conventional Sanger sequencing method. Measuring transverse tunnel currents through a single-stranded DNA as it translocates through a nanopore has been proposed as a suitable physical method for single-base resolution. It has also been shown experimentally that the four bases give distinguishable transverse electronic signatures when measured with a Scanning Tunneling Microscope (STM), which directly probes the molecular levels of single DNA bases.

In 2011, a remarkable experiment, again reported in Science, studied spin-selective transmission through self-assembled monolayers of double-stranded DNA and found a high degree of spin polarization. Immediately afterwards, it was explained theoretically that, in the presence of spin-orbit coupling, dephasing and helical symmetry, this kind of topology is quite capable of producing spontaneous spin polarization even in a two-terminal system. This makes the discovery an absolute breakthrough, as it opens up several possibilities for designing higher-ZT functional elements using artificially synthesized DNA molecules, which is the future of molecular electronics.

Let me end with a quote attributed to Nikola Tesla: “What one man calls God, another calls the laws of Physics.” So the journey of a physicist is along an infinite path of truth, to get one step closer to nature.

Spintronics at the Nanoscale: A Paradigm Shift in the Future of Electronics

What is Spintronics?

(Schematic illustration of spin transport through a nanoscale system. Figure taken from https://physics.aps.org/articles/v4/28)

“Spintronics” (a name coined by S. Wolf in 2001), a sub-discipline of condensed matter physics, has emerged as a promising field over the last two decades that aims to manipulate the spin degree of freedom of the electron to explore much richer and more intriguing physics. Until a few years ago, mainstream ‘electronics’ was ‘charge’ based, whereas in spintronics it is not the charge but the ‘spin’ of the electron that carries the information. The possibility of manipulating electron ‘spin’ was opened up by the discovery of the Giant Magnetoresistance (GMR) effect, made independently by Peter Grünberg and Albert Fert in 1988, which earned them the Nobel Prize in 2007.

Giant Magnetoresistance (GMR) Effect:

GMR is the drastic change in the electrical resistance of a multilayer formed by alternating magnetic and non-magnetic metallic layers when an external magnetic field is applied. In the absence of any external magnetic field, the exchange coupling between the magnetic layers through the non-magnetic one aligns their magnetization vectors anti-parallel to each other. When a magnetic field strong enough to overcome this anti-ferromagnetic coupling is applied, all the magnetization vectors align themselves along the field direction. This new parallel configuration exhibits an electrical resistance much smaller than the anti-parallel one, and this dramatic change in resistance is known as the GMR effect. The discovery of GMR has had far-reaching consequences: not only did it demonstrate the interplay between transport and magnetism, it also established that the long-neglected spin degree of freedom can play a crucial role in transport phenomena and hence in electronics.

(Schematic illustration of GMR Effect. Figure taken from https://simple.wikipedia.org/wiki/Giant_magnetoresistance)
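
The size of the effect can be illustrated with a toy two-current (Mott) resistor model of a magnetic/non-magnetic/magnetic trilayer. This is only a minimal sketch in which each spin channel is treated as two layer resistances in series, and the numerical resistance values below are purely illustrative assumptions:

```python
def gmr_ratio(r_low, r_high):
    """Two-current resistor model of a FM/NM/FM trilayer.

    r_low  : resistance a spin channel sees in a layer whose magnetization
             is aligned with that spin (weak scattering)
    r_high : resistance when the magnetization is anti-aligned (strong scattering)
    Returns (R_parallel, R_antiparallel, GMR ratio).
    """
    # Parallel alignment: one channel sees 2*r_low, the other 2*r_high,
    # and the two spin channels conduct in parallel.
    R_P = (2 * r_low * 2 * r_high) / (2 * r_low + 2 * r_high)
    # Antiparallel alignment: each channel sees r_low + r_high.
    R_AP = (r_low + r_high) / 2
    return R_P, R_AP, (R_AP - R_P) / R_P

R_P, R_AP, ratio = gmr_ratio(r_low=1.0, r_high=5.0)   # assumed values (ohms)
print(f"R_parallel = {R_P:.2f}, R_antiparallel = {R_AP:.2f}, GMR = {100*ratio:.0f} %")
```

In this toy model the GMR ratio works out to (r_high - r_low)^2 / (4 r_low r_high), so the stronger the spin asymmetry of the scattering, the larger the effect.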

Application and Possibilities:

Today, GMR-based magnetic data-storage devices are found in every computer, and they have already created a multi-billion-dollar industry. The GMR effect also provides the key idea for a magnetic-field sensor: since a magnetic field can change the orientation of the magnetic moments in one layer of the structure, it disrupts the relative orientation between the layers and hence changes the electrical resistance.

Another significant application is the Spin Dependent Tunneling (SDT) device, which is very similar to the GMR setup, the only change being that the non-magnetic metallic layer is replaced by an insulator. In this case, the resistance difference between the two configurations (parallel and anti-parallel) is quite large compared to the GMR setup. The key concept is to encode the ‘low resistance’ state as ‘1’ and the ‘high resistance’ state as ‘0’, and this is used in Magnetoresistive Random Access Memory (MRAM).

One of the most notable advances in spin-based devices is the experimental discovery of the Spin Momentum Transfer (SMT) effect in 2002, in which a spin-polarized current can exert a torque on the magnetization of a magnetic film. SMT has the potential to offer orders-of-magnitude smaller switching currents, and consequently much less energy is needed to write each bit. In contrast to ordinary MRAM, SMT-MRAM provides advantages in speed, energy and endurance. There are various other significant developments in spin-based devices (such as the Spin FET, Spin RTD, etc.) which make this field a promising as well as a challenging one.

Successful spin-based device making involves three pertinent requirements:

(i) Spin injection

(ii) Spin transport with long spin coherence length

(iii) Spin detection.

Recent efforts in designing and manufacturing spintronic devices involve two different approaches.

  • The first is to improve the existing GMR-based technology, either by developing new materials with larger spin polarization of electrons or by perfecting existing devices so that they behave as efficient spin filters.
  • The second approach focuses on finding novel ways of generating and utilizing spin-polarized currents. These include the investigation of spin transport in semiconductors and the search for ways in which semiconductors can function as spin polarizers and spin valves.

Spintronics at Nano-scale: 

Spintronic devices offer several advantages over conventional bulk semiconductor devices, such as:

  • Non-volatility (i.e., the information is retained even after the device is switched off)

  • Increased Data Processing
  • Decreased Electrical Power Consumption
  • Increased Integration Density

Since the discovery of the GMR effect, remarkable advances have taken place in data processing, device-making technologies and quantum computation. The ultimate target is to go beyond ordinary binary logic using ‘qubits’ and ‘spin entanglement’ for new quantum-computing strategies, which in turn requires controlled manipulation of spin dynamics, even at the single-spin scale. This demands the merging of two rapidly evolving fields, ‘Spintronics’ and ‘Molecular Electronics’. In short, a deeper understanding is needed of spin dynamics and spin transport at atomic and molecular length scales.

There have been several path-breaking research findings, such as the Spin-FET, the Spin Hall Effect (SHE), spin polarization in double-stranded DNA, non-volatile spin logic gates, and a considerable increase of the spin coherence length by applying sound waves, to name a few. For example, in the Datta-Das Spin-FET, spin-orbit coupling (of the Rashba type) is used to modulate the spin orientation of the electrons in the conduction band of a 2DEG. The Rashba coupling strength can be tuned externally by applying a gate voltage, which provides an advantage over using ferromagnetic leads, as the latter generate a large resistivity mismatch. Similarly, in the SHE, the longitudinal flow of an unpolarized charge current through a sample with spin-orbit coupling induces a non-equilibrium spin accumulation at the lateral edges of the sample, and thus a pure spin current is established. Here, just as in the classical Hall effect, the spin-orbit interaction produces a spin-dependent transverse force that acts on up- and down-spin electrons and deflects them in opposite directions.
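
For orientation, the standard textbook expression for the Datta-Das geometry (quoted here as a reference formula, not taken from the text above) says that electrons crossing a channel of length L acquire a differential phase between the two spin components

```latex
\Delta\theta \;=\; \frac{2\, m^{*} \alpha\, L}{\hbar^{2}}
```

where m* is the effective mass and α the gate-tunable Rashba strength; sweeping α with the gate voltage therefore rotates the spin arriving at the ferromagnetic drain and modulates the source-drain current.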

Conclusion:

In short, ‘Spintronics’ undoubtedly brings new challenges to researchers across the globe. We are all hopeful that the newly developing spin-based electronic devices will soon replace the traditional ones. Researchers anticipate that establishing the principles connecting different disciplines of physics, like electromagnetism, heat and even mechanics, may provide an opportunity to extend spintronics remarkably, directing us to a new paradigm of physics and its applications.

Quantum Computers: Reality or Myth? A New Name!

“Quantum Computers” – a term unheard of by most of us even two decades ago – have now stimulated the scientific community with the hope of providing breakthroughs in science, medicine, machine learning and more. From breaking encryption to revolutionizing artificial intelligence, quantum computers promise to outperform traditional ones in efficiency, sometimes on an exponential scale. They are arguably the most powerful computational devices offered to us by nature. Big companies like Google, IBM and Microsoft are spending billions of dollars to create reliable quantum-computing facilities. In October 2019, Google claimed to have achieved “quantum supremacy” using an array of 54 qubits (of which 53 were functional) to perform a series of operations that, they claimed, would have taken the world’s best supercomputer 10,000 years!! Reportedly, IBM objected to this claim, saying a supercomputer could do it in only a few days. But who cares! Let the titans clash. In any case, achieving quantum supremacy does not mean that quantum computers are ready to perform right now. There is a long way to go, but at least the road map is known to us.

Why Quantum Computers?

The computers we use in our daily lives (PCs, laptops, tablets, smartphones and even HPCs) are made of chips, and those chips use bits as the fundamental unit to store and manipulate information. Bits are like tiny switches that can be either “ON” (representing one state) or “OFF” (another state), manipulated by classical logic-gate operations acting on the ‘charge’ of the electron. Those operations rely on the fundamental principle of classical physics that, at a given time, a physical system can be in one and only one state. But the universe doesn’t always work like that. Thanks to rapid progress in miniaturization, chips now have typical dimensions of a few square millimetres or less and contain millions of transistors, so quantum effects are present, and chip-makers go to great lengths to suppress them. If we could instead manage to work with those effects, further miniaturization would be possible, providing a paradigm shift in computational performance. Not only that, quantum computational algorithms are capable of handling ‘uncertainties’ and are thus far better candidates for simulating complex biological and chemical systems.

How do they work?

In quantum computers, qubits ($|0\rangle$ or $|1\rangle$) are used instead of bits; they can be based on another property of the electron, namely its ‘spin’. According to quantum mechanics, the state of a system can be a linear, coherent superposition of many different states, and is thus capable of producing interference. Most magically, even two spatially separated states can be entangled with each other, so that operations on one have a non-local effect on the other. Thanks to these properties, qubits can be in many states (unlike just “ON” and “OFF” in the classical case). For example, an $n$-qubit system can be in a superposition of $2^{n}$ basis states, and entangling an $n$-qubit register with an $m$-qubit register enhances the number of possibilities to $2^{m+n}$ rather than $2^{n} + 2^{m}$.
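
The counting argument can be made explicit with a minimal NumPy sketch (purely illustrative, not tied to any particular quantum-computing framework): an n-qubit register is described by 2^n complex amplitudes, and joining two registers multiplies, rather than adds, the number of basis states.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])          # |0>
ket1 = np.array([0.0, 1.0])          # |1>

# A single qubit can be in a coherent superposition a|0> + b|1>
plus = (ket0 + ket1) / np.sqrt(2)    # the |+> state

def register(*qubits):
    """Tensor (Kronecker) product of single-qubit states -> one state vector."""
    state = qubits[0]
    for q in qubits[1:]:
        state = np.kron(state, q)
    return state

n_state = register(plus, plus, plus)     # n = 3 qubits
m_state = register(ket0, ket1)           # m = 2 qubits
joint   = np.kron(n_state, m_state)      # combined (n + m)-qubit register

print(len(n_state), len(m_state), len(joint))   # prints: 8 4 32  (= 2**3, 2**2, 2**(3+2))
```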

Analogous to classical logic-gate operations (like AND, OR and NOT), in quantum computation the identity matrix and the Pauli matrices provide the one-qubit logical operations. For example, the bit-flip gate $X$ turns $|0\rangle$ into $|1\rangle$ and vice versa, while the phase-flip gate $Z$ puts a ‘-’ sign in front of the qubit $|1\rangle$. Symbolically, $X|0\rangle = |1\rangle$, $X|1\rangle = |0\rangle$, $Z|0\rangle = |0\rangle$ and $Z|1\rangle = -|1\rangle$. Other operations are similarly performed by sequences of unitary operations.
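
Continuing the same illustrative NumPy sketch, the bit-flip and phase-flip gates are just 2×2 matrices acting on those state vectors:

```python
import numpy as np

X = np.array([[0, 1],
              [1, 0]])    # bit-flip (Pauli-X): swaps |0> and |1>
Z = np.array([[1,  0],
              [0, -1]])   # phase-flip (Pauli-Z): |1> -> -|1>, |0> unchanged

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

print(X @ ket0)   # [0 1]   -> |1>
print(X @ ket1)   # [1 0]   -> |0>
print(Z @ ket1)   # [ 0 -1] -> -|1>
```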

Past, Present and Future…

This mind-boggling concept was first put forward by stalwarts like Yuri Manin (1980), Richard Feynman (1981) and Paul Benioff. Later, the design of the first quantum Turing machine by David Deutsch in 1985 opened the gateway to unimaginable possibilities. Many groups worldwide are now working towards the success of quantum computation.

To date, two technologies are mainly reported for creating qubits: trapping ions, and using miniature superconducting circuits. IBM uses the latter. The problem is that, to do anything useful with the qubits, their coherence must be preserved, and they are extremely sensitive to environmental noise.

In the attached figure (source: https://www.ibm.com/quantum-computing) you can see the dilution refrigerator made by IBM, with nearly 2000 components, for creating a super-cold ambience so that the coherence of the qubits is maintained.

Figure: Dilution refrigerator made by IBM for maintaining coherence for qubits

To date, the highest number of qubits was created by Google, as reported in 2019 (an array of 54 qubits).

The future is unknown, but we are hopeful about the unending possibilities that quantum computers offer. It is not as if we are going to have a quantum chip in our PC or smartphone (there will be no iPhone Q), but for research and business purposes the possibilities are huge. From modeling highly complicated chemical reactions and enabling early diagnosis of Alzheimer’s to anticipating the financial market, quantum algorithms promise unimaginable success. Cryptography can have an incredible makeover once quantum computation becomes successful. There are rumors that various intelligence agencies around the world are stockpiling encrypted data in the hope that it can soon be decoded once quantum computers succeed. The only protection from this also lies within the realm of quantum mechanics, i.e., quantum encryption!! Think of a key that can’t be copied or hacked: quantum-encrypted keys are exactly that.

It may sound like a magical world, but it will happen in the near future. Every breakthrough in science has passed through dilemma and confusion. That’s the essence of science: to unravel nature, to solve puzzles whose answers are embedded within it.

Such mysteries and possibilities make Physics as challenging and appealing as ever!!
