My dream in San Francisco 2

It would be really cool to work for the City of San Francisco! If I get a chance for a job interview there, I will say that I have always thought San Francisco is the most visionary city in the world! It is 20 years ahead of any other place!

 

Before global warming and excess carbon footprints were taken seriously, SF had already built a comprehensive public transportation system, so people don't even need a car to live there! That was not an easy task, because the city planners not only had to design and build the system but also needed to make sure everyone could find everything they need within it! I used to think Davis was the most environmentally friendly city, where you can reach many places by bike or bus, but I still had to drive a long way to visit malls, museums, and some grocery stores. In SF, however, I am pretty sure I can find anything I need along the BART/Muni routes!

 

It is also quite amazing that even the buses in SF are powered by electricity, and that was before we saw any electric or hybrid cars! So now SF is ready to connect its already electrified system to renewable energy! That is why I know deeply that if I want to change the world and make it a better place to live, I need to start right here in San Francisco!

 

However, before I can get any interview, I need to submit all the essays below for my job applications. But I like writing anyway, so hopefully the hiring managers will feel my passion for working there from my words!



 

1. Explaining forensic toxicology concepts

 

When I was a graduate student at UC Davis, I often used forensic toxicology concepts to help people understand more about my dissertation research.

 

First, I used 1H and 13C nuclear magnetic resonance (NMR) technology to study protein-lipid interactions and their role in metabolism regulation and bioenergetics. Since every biomolecule has its distinct NMR spectrum, we can detect any newly formed molecules in the biochemical reactions from the spectrum. The application of NMR in our research is very similar to how we find the toxins that caused the illness or death of a person in a criminal case. Indeed, NMR and other spectroscopic methods can all be used in forensic toxicology to identify an unknown toxin from its unique signature in the spectroscopic analysis. For example, carbon monoxide poisoning can be detected from the NMR spectrum of muscle tissue, where the valine E11 gamma-CH3 peak of myoglobin shifts from -2.76 to -2.26 ppm when the oxygen bound to myoglobin is completely replaced by carbon monoxide.
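To illustrate the idea of spectral signatures, here is a minimal sketch (not code from my dissertation) of how an observed chemical shift could be matched against reference values to flag carbon monoxide exposure; the two reference shifts are the myoglobin values quoted above, and the matching tolerance is an arbitrary illustrative choice.

```python
# Minimal sketch: match an observed chemical shift against reference values.
# Reference shifts are the myoglobin valine E11 gamma-CH3 values quoted above;
# the tolerance is an illustrative assumption, not a validated criterion.

REFERENCE_SHIFTS_PPM = {
    "oxymyoglobin (O2 bound)": -2.76,
    "carboxymyoglobin (CO bound)": -2.26,
}

def classify_peak(observed_ppm, tolerance=0.05):
    """Return the reference species whose shift is closest to the observed
    peak, or None if nothing matches within the tolerance."""
    name, ref = min(REFERENCE_SHIFTS_PPM.items(),
                    key=lambda kv: abs(kv[1] - observed_ppm))
    return name if abs(ref - observed_ppm) <= tolerance else None

print(classify_peak(-2.27))   # -> carboxymyoglobin (CO bound)
print(classify_peak(-2.75))   # -> oxymyoglobin (O2 bound)
```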

 

Second, in my research laboratory, we actually used toxins to help us understand more about metabolism regulation. For example, carbon monoxide was commonly used in our studies because it can inhibit oxidative metabolism by competing with oxygen to bind oxygen-transporter proteins such as hemoglobin and myoglobin. Sometimes we used toxins to create more stable alternatives to our native proteins. For example, myoglobin bound to carbon monoxide has almost the same structure as oxymyoglobin but is much more stable in the same environment. What makes toxins harmful to our bodies, such as their ability to compete with our metabolic processes and their irreversible effects after forming very stable complexes with our biomolecules, actually makes them very useful in biomedical research. Although the role of toxins in my dissertation research is quite different from their role in forensic toxicology, I have learned a lot about the mechanisms of toxins from my research.

 

Third, besides the adverse effects of toxins, how they are distributed in our bodies is also very important. A lot of effort in my dissertation research went into building mathematical models of intracellular fatty acid transport, and toxins, like fatty acids, also have their distinct ways of being transported throughout the body. For instance, if a toxin is water soluble, it is very likely to be transported from the digestive system into the blood circulation to reach the target organ, and its adverse effect is diminished as it is gradually carried away by the circulation, metabolized by the liver, and excreted by the kidneys. However, if the toxin molecule is not water soluble, like fatty acid, it can easily pass through cell membranes, for example through skin exposure, to reach the target organ, and it is also hard to clear from the blood circulation, so the adverse effect is more severe. Therefore, from how toxins are transported, where they are distributed, and how long they have stayed there, we can obtain more detailed information about them. This is very helpful in forensic toxicology when we want to know how and when the victims were exposed to a toxin.
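As a toy illustration of why solubility matters for clearance (this is not my actual dissertation model, and the rate constants are invented), a one-compartment first-order elimination sketch already shows how much longer a slowly cleared, lipophilic toxin lingers:

```python
# Toy one-compartment kinetics sketch (illustrative only): a water-soluble
# toxin with a fast elimination rate vs. a lipophilic toxin with a slow one.
# Both rate constants are made-up numbers.
import math

def concentration(c0, k_elim, t_hours):
    """First-order elimination: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k_elim * t_hours)

for label, k in [("water-soluble (k=0.3/h)", 0.3),
                 ("lipophilic   (k=0.03/h)", 0.03)]:
    levels = [round(concentration(100.0, k, t), 1) for t in (0, 6, 24, 72)]
    print(label, levels)   # concentration at 0, 6, 24, and 72 hours
```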

 

  

 

2. Quality programs I have participated in

 

In my current job, I have been responsible for creating the quality control protocol for the high-throughput robotic system (Hamilton Microlab Starplus) used in the newborn genetic screening assay for spinal muscular atrophy, based on the guidelines of the Clinical Laboratory Improvement Amendments (CLIA), which are similar to ISO/IEC 17025. The robotic system performs both DNA extraction from the newborn dried blood spots (DBS) and preparation of the DNA samples for the subsequent qPCR reactions. I have also participated in the quality program for the reagents and reference materials (control samples with known DNA concentrations) used in the assay. The three aspects of the quality program are validation, verification, and traceability of the collected data.

 

In the validation process, we check whether the robots do what we expect them to do. The two important concepts in the validation process are precision and accuracy. For precision, we check whether the output values from the robots are consistent; this is indicated by the ratio of the standard deviation to the mean of the measured data. For accuracy, we check whether the measured values are close to our expected values; this is indicated by the percentage difference between the expected value and the mean of the measured values. The robotic system is validated when both the precision and accuracy values fall within the ranges of our criteria. In my validation protocol, I measured the temperature and volume of the heating and cooling units for 5 days and calculated their precision and accuracy values. After the robots were validated, I teamed up with my coworkers to measure the precision and accuracy of the final sample volumes based on the results from the qPCR machine, namely the threshold cycle (CT) values that represent the amount of the gene detected in the samples. Since there is no expected CT value, the volume accuracy is measured by comparing the CT difference between robotic and manual pipetting. For validating the reference materials and reagents used in the assay, only the precision values are measured from the collected CT data.
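A minimal sketch of these two calculations, with invented readings rather than real validation data, could look like this:

```python
# Precision as coefficient of variation (SD / mean) and accuracy as the
# percentage difference between the mean and the expected value.
# The temperature readings below are invented for illustration.
import statistics

def precision_cv(values):
    """Precision: standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values)

def accuracy_pct_diff(values, expected):
    """Accuracy: percentage difference between the mean and the expected value."""
    mean = statistics.mean(values)
    return abs(mean - expected) / expected * 100.0

temps_c = [37.1, 36.9, 37.2, 37.0, 36.8]   # hypothetical incubator readings
print(f"precision (CV): {precision_cv(temps_c):.3%}")
print(f"accuracy: {accuracy_pct_diff(temps_c, expected=37.0):.2f}% difference")
```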

 

In the verification process, we check whether different robots, or different reagents or reference materials, give us the same results. For robot verification, it is sufficient if both the precision and accuracy data for the temperature and volume of the different robots fit the criteria. For the reagents and reference materials, the CT values from the old samples, the new samples, and the combination of the two groups should all have their precision measurements fall within the same criteria. For reagent verification, it is important to repeat the process whenever a reagent with a different lot number from our suppliers is used.

 

The traceability of the collected data is also very important in quality control, especially during troubleshooting. In the quality program for the robots, the previous validation and verification data are all carefully documented, and all the processed samples are recorded in our database system. Collaborating with my team and Hamilton engineers, we created a workflow for the patient sample data that goes from the specimen barcoding system to the robotic system and subsequently to the qPCR machines. This ensures that if there is any error in the process, it is easy to trace back which samples are affected.
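As a toy illustration of the idea (the field names below are made up, not our actual database schema), each specimen barcode can carry a record of every instrument and plate position it passes through:

```python
# Toy traceability sketch: every processed sample keeps a chain of
# (instrument, plate, well) records from barcode scan to qPCR.
from dataclasses import dataclass, field

@dataclass
class SampleTrace:
    specimen_barcode: str
    steps: list = field(default_factory=list)   # (instrument, plate, well)

    def log(self, instrument, plate, well):
        self.steps.append((instrument, plate, well))

trace = SampleTrace("DBS-2023-000123")                      # hypothetical barcode
trace.log("Hamilton-robot-1", "extraction-plate-07", "C4")
trace.log("qPCR-3", "pcr-plate-07", "C4")
print(trace)   # the full chain used to find affected samples during troubleshooting
```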

 

 

 

 

3. Analytical instrumentation techniques

 

My experience, training, and skills using analytical instrumentation can be categorized into three parts: spectroscopic, microscopic, and biomolecule isolation methods.

 

I used spectroscopic methods extensively during my dissertation research. I used nuclear magnetic resonance (NMR) spectroscopy to study protein structure and function, in particular protein-lipid interactions and their role in metabolism regulation. I also used NMR to characterize different types of fatty acid and lipid molecules and to measure their solubility and aggregation states in aqueous solution, and I helped my colleagues determine the distribution of the different isomers formed in sialic acid synthesis from their NMR spectra. In addition, I used UV-Vis and fluorescence spectroscopy quite frequently to characterize protein species and measure the concentrations of proteins, nucleic acids, and other biomolecules based on their light absorbance. For instance, when the oxyhemoglobin in blood is exposed to carbon monoxide, the peaks at around 540 and 576 nm in the visible spectrum shift to around 538 and 568 nm. During my graduate studies in biomedical engineering, I received rigorous training in sensor technologies for acquiring signals and images from biochemical samples and exporting the obtained data for quantitative and statistical analysis, so I am confident in my abilities as a technical expert in spectroscopic methods.
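As a small illustration of how such a band shift can be picked out programmatically (the spectrum below is synthetic, not real data):

```python
# Locate the two visible-band maxima of a synthetic "carboxyhemoglobin-like"
# spectrum and compare them to the reference wavelengths quoted above.
import numpy as np
from scipy.signal import find_peaks

wavelengths = np.arange(500, 620, 1.0)                     # nm
# Two Gaussian bands centered at 538 and 568 nm (illustrative shapes only).
absorbance = (np.exp(-((wavelengths - 538) / 8) ** 2)
              + 0.9 * np.exp(-((wavelengths - 568) / 8) ** 2))

peaks, _ = find_peaks(absorbance, prominence=0.1)
print("peak maxima (nm):", wavelengths[peaks])              # ~ [538, 568]
```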

 

I used microscopic methods to study metabolism compartmentalization at different levels of neuron cells and its relation to the neural pathways of the brain during my graduate studies at Cal Poly. I used enzyme histochemistry to label the locations of the target enzymes (with distinct colors) in order to see their distribution in brain tissue under the microscope. I was also trained to use immunohistochemistry to label neural proteins in brain tissue under the confocal microscope at USC. With my training in medical imaging from my graduate studies, I am familiar with how to use software and programming methods to extract valuable information from the images. For instance, I wrote a Matlab program for the automatic registration of neuron images taken from the confocal microscope at different times, for studying neuron movement.
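The core of that registration idea can be sketched in a few lines of Python (a translation-only phase-correlation example with placeholder images, not my original Matlab code):

```python
# Translation-only image registration by phase correlation.
# The "images" here are random placeholders with a known applied shift.
import numpy as np

def register_translation(ref, moving):
    """Estimate the (row, col) shift by which `moving` is displaced from `ref`."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(moving)
    cross_power = np.conj(F1) * F2
    cross_power /= np.abs(cross_power) + 1e-12          # normalize magnitudes
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert the peak position into a signed shift.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

ref = np.random.rand(64, 64)
moving = np.roll(ref, shift=(3, -5), axis=(0, 1))         # simulate a known shift
print(register_translation(ref, moving))                  # -> (3, -5)
```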

 

I have used biomolecule isolation methods to isolate target molecules in order to analyze them further. In my current work, I use a high-throughput robotic system to isolate DNA from newborn dried blood spots (DBS) in order to screen them for spinal muscular atrophy (caused by the lack of the SMN1 gene) with qPCR. The extraction process includes high-temperature incubation to release DNA from white blood cells and several centrifugation steps to separate the DNA from the other substances in the blood. I also worked on DNA extraction when I was a teaching assistant for a molecular biology laboratory at UC Davis. In addition, I helped the students run agarose gel electrophoresis to identify the PCR products based on the size of the DNA molecules; after the electrophoresis was finished, the DNA bands stained with ethidium bromide could be visualized in a fluorescence chamber. Furthermore, I am also familiar with the extraction of proteins and lipids from blood and tissue from my dissertation research. I used a variety of purification techniques, such as affinity, size exclusion, and thin layer chromatography, membrane filtration, dialysis, extraction with organic solvents, and evaporation.

 

 

 

 

4. Drug- and alcohol-related experience

 

In my work at Company O, I developed multiplex lateral flow immunoassays for urine and saliva testing for drugs of abuse, as well as enzymatic colorimetric assays for detecting alcohol in saliva.

 

The multiplex immunoassay device can be used to detect six types of drugs of abuse: amphetamine, opiates, cocaine, MDMA, benzodiazepines, and THC. During the drug test, the saliva travels up the nitrocellulose membrane and, due to the capillary effect, carries with it the anti-drug antibodies conjugated with colored gold nanoparticles from a pad attached just below the membrane. The drugs are already sprayed on the membrane (they are bound to the membrane and not free to move), so if there is no drug in the saliva, the antibodies bind to the drugs on the membrane and form colored bands. However, if drugs are present in the saliva, they compete with the drugs on the membrane for antibody binding, so no colored band forms. We can identify which of the drugs are in the saliva by the absence of the colored bands at the distinct locations on the membrane. The most important information needed for developing the immunoassay is the cutoff concentration of each drug, which is often set by government agencies. I would adjust the concentration of the nanoparticle-conjugated antibodies on the device based on the cutoff concentration. In general, a drug in the saliva should be detected at 1.5x the cutoff concentration (as the limit of detection) and not be detected at 0.5x the cutoff concentration (in order to minimize the cost of the reagents). In addition to the concentration of the antibodies, it is also important to achieve the optimal conjugation between the antibodies and the nanoparticles by carefully adjusting the pH of the solution. If different drugs are going to be detected on the same membrane, I also have to carefully adjust the antibody concentrations to avoid the cross-reactions that happen when the antibody used for a certain drug interacts with the other drugs.
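The cutoff design rule can be summarized in a small sketch (the cutoff values below are invented placeholders; the real ones come from the relevant agency guidelines):

```python
# Illustrative sketch of the cutoff design rule for a competitive assay:
# detect at >= 1.5x the cutoff, do not detect at <= 0.5x the cutoff.
CUTOFF_NG_PER_ML = {"amphetamine": 50, "THC": 4}   # hypothetical numbers

def expected_band(drug, concentration_ng_ml):
    """In this competitive format a visible band means 'no drug detected'.
    Return what the device should show at a given analyte concentration."""
    cutoff = CUTOFF_NG_PER_ML[drug]
    if concentration_ng_ml >= 1.5 * cutoff:
        return "no band (drug detected)"
    if concentration_ng_ml <= 0.5 * cutoff:
        return "band present (negative)"
    return "indeterminate zone"

print(expected_band("THC", 8))   # >= 1.5x cutoff -> no band
print(expected_band("THC", 1))   # <= 0.5x cutoff -> band present
```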

 

I also worked on enzymatic colorimetric assays for detecting alcohol in saliva. The assay has a similar format to the immunoassay, but the antibodies are replaced with two enzymes and their substrates (other than the alcohol itself). One of the enzymes oxidizes the alcohol into its metabolites, and the other mediates a coupled reaction that reduces one of its substrate molecules into a colored product used as the indicator for the presence of alcohol. The quantity of enzymes used in the assay is also based on the cutoff concentration. The main challenge in developing the enzymatic colorimetric assay for alcohol is that the shelf life of the enzymes is much shorter than that of the antibodies used in the immunoassays.

 

I also spent a lot of time on hapten synthesis for developing new THC antibodies that could recognize more types of THC metabolites in saliva and urine, so the assay could be more sensitive. A hapten is a molecule that is structurally similar to the target drug, so that when it is injected into the bloodstream of an animal, it can stimulate the production of antibodies against the target drug. Although my hapten synthesis project was not successful due to the financial constraints of the company, I acquired a lot of knowledge about drug metabolism while conducting the literature research for my work.

 

 

 

 

5. Improved service that subsequently led to better outcomes

 

When I worked at Company D, I helped one of our suppliers troubleshoot an issue in their product.

 

We had ordered blue latex beads (300 nm diameter polystyrene particles) from this company for developing our lateral flow hybridization assay, which detects the oligonucleotides in our DNA ink product used for counterfeit prevention. The sequence of the oligonucleotides in the DNA ink functions like a barcode that is used to authenticate important documents such as personal checks. Paired with the DNA ink, a lateral flow detection device containing oligonucleotides complementary to those in the ink is used to detect the presence of the DNA ink. When the oligonucleotides from the DNA ink and the detection device hybridize, the latex beads get stuck at the location of hybridization and form a blue band on the device, indicating the specific sequence in the ink.

 

We had been using the latex beads from this supplier and getting excellent results in the assay development for more than 6 months. However, when the company sent us beads from a different lot, we found they were not functioning at all: they could not travel through the nitrocellulose membrane on the device. Using a Malvern particle size analyzer to measure the bead size with dynamic light scattering, we found the latex beads were severely aggregated. The supplier agreed to refund our purchase and asked us to send the beads back so that they could figure out the problem. However, after a few weeks, they sent us a report saying that they had tried to use sonication to break up the aggregated beads but it did not solve the issue.

 

Thinking that it would cost much more if we bought latex beads from other companies, I decided to help the supplier troubleshoot the issue. With the help of my colleagues, I found a very old sonicator in my company. I put the bead solution in a 1.5 ml tube and placed the tube inside the water bath of the sonicator, but it did not work. When I checked my experimental conditions again, I realized that the wall of the tube might be too thick for the ultrasound waves to penetrate into the bead solution. Therefore, I looked around for thinner containers, and I found that the plastic roll used for packaging in my company was very thin, so I cut it and used a heat sealer to make a plastic bag. I put the aggregated latex bead solution into the plastic bag, sealed the opening, and let the bag sit in the water bath of the sonicator for 90 minutes. It worked, and the beads were no longer aggregated! The sonicated beads also worked very well for detecting DNA hybridization on the device. I then wrote a protocol with all the details of my solution and sent it to the supplier.

 

It felt very rewarding to be able to help the supplier troubleshoot such a serious issue. At that time, I did not expect that I would later be asked to synthesize our own latex beads for my company, and what I learned about sonication truly helped me a lot in that project. Therefore, we should always help others as much as we can, because it can be an opportunity for us to learn new things and broaden our knowledge!

 

 

 

 

 

Another job position:

 

 

 

 

1. Experience in performing quantitative and qualitative analysis (related to Environmental Protection Agency (EPA) or other approved methods)

 

In my current work, I have been responsible for creating the quality control protocol for the high-throughput robotic system (Hamilton Microlab Starplus) used in the newborn genetic screening assay for spinal muscular atrophy (SMA), based on the guidelines of the Clinical Laboratory Improvement Amendments (CLIA). The accreditation body for CLIA is the Centers for Medicare & Medicaid Services (CMS), and our facilities are inspected by CMS every 2 years. The robotic system performs both DNA extraction from the newborn dried blood spots (DBS) and preparation of the DNA samples for the subsequent qPCR reactions. In the qPCR runs, the amount of the SMN1 gene at the start of the PCR amplification is measured semi-quantitatively as the threshold cycle (CT). Based on the CT values of the samples, I can determine whether they are SMA-positive (lacking the SMN1 gene).
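As a simplified illustration of this kind of CT-based call (the thresholds and control logic below are placeholders, not the actual screening algorithm):

```python
# Toy sketch of a CT-based screening call: flag a sample when the SMN1
# target fails to amplify while an internal control amplifies normally.
# All cutoff values are illustrative assumptions.
NO_AMPLIFICATION = 40.0      # CT at or above this is treated as "not detected"

def sma_screen_call(smn1_ct, control_ct, control_max_ct=32.0):
    """Return a presumptive screening result from the two CT values."""
    if control_ct >= control_max_ct:
        return "invalid (control failed)"
    if smn1_ct >= NO_AMPLIFICATION:
        return "presumptive positive (SMN1 not detected)"
    return "negative (SMN1 detected)"

print(sma_screen_call(smn1_ct=41.0, control_ct=27.5))  # presumptive positive
print(sma_screen_call(smn1_ct=26.0, control_ct=27.0))  # negative
```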

 

In my previous work at Company O, I developed multiplex lateral flow immunoassays for urine and saliva testing for drugs of abuse. Our facilities were ISO 13485 certified. The multiplex immunoassay device can be used to detect six types of drugs of abuse: amphetamine, opiates, cocaine, MDMA, benzodiazepines, and THC. During the drug test, the saliva travels up the nitrocellulose membrane and, due to the capillary effect, carries with it the anti-drug antibodies conjugated with colored gold nanoparticles from a pad attached just below the membrane. The drugs are already sprayed on the membrane (they are bound to the membrane and not free to move), so if there is no drug in the saliva, the antibodies bind to the drugs on the membrane and form colored bands. However, if drugs are present in the saliva, they compete with the drugs on the membrane for antibody binding, so no colored band forms. We can identify which of the drugs are in the saliva by the absence of the colored bands at the distinct locations on the membrane.

 

 

 

 

2. Experience in the use and maintenance of general laboratory equipment and analytical instruments

 

In my current work, I am responsible for the maintenance of the robotic liquid handling system and its accessories, including the incubators, cooling units, plate sealers/peelers, and centrifuges. I run calibrations and volume testing every month and schedule the more comprehensive annual/semiannual preventive maintenance with the equipment suppliers. Since we run human specimens on the robots, I have to take precautions when touching the surfaces of the robotic system and treat all the samples as biohazardous material.

 

In my research at UC Davis, I also did most of the maintenance work in the laboratory. I ran calibrations for the UV-Vis spectrometer, pH meter, and size exclusion chromatography columns. Since I used a nuclear magnetic resonance (NMR) machine (Bruker Avance 600 MHz spectrometer) to study protein-lipid interactions, I had to do a lot of calibration, such as tuning and shimming, to optimize the NMR data collection process. The most important safety procedure is not to bring any metal near the machine because of its strong magnetic field. I also had to be very careful while changing the probe at the bottom of the machine because of the high voltage nearby.

 

From my research, I found that myoglobin can bind fatty acid, based on the intensity change of the myoglobin heme 8-methyl peak. However, in one of my experiments, there was no change in the peak no matter how much fatty acid I mixed with the myoglobin solution. To troubleshoot the issue, I first checked whether my samples were contaminated, but I did not find any contamination signals in the spectra. Second, I tested the pipettes that I used to prepare my samples, but I found no error in the dispensed volumes. Third, I checked the temperature of my myoglobin and fatty acid samples, but there was still no error in the sample temperatures.

 

Therefore, I suspected that the temperature inside the 600 MHz NMR machine I was using to collect my data might be wrong. To test this hypothesis, I ran my samples on both the 600 MHz and the 400 MHz NMR machines, set at exactly the same temperature, and the NMR signal indicating the change in peak intensity showed up only in the spectra collected on the 400 MHz machine. The test proved that I was right about the temperature error inside the 600 MHz machine. By collecting data at different temperature settings, I found that the temperature inside the 600 MHz machine was off by 7 degrees Celsius. As a result, after adjusting the temperature setting of the NMR machine, my experimental data were consistent with the previous data.

 

 

 

 

3. My general knowledge and experience of Quality Assurance and Quality Control (QA/QC)

 

In my current job, I have been responsible for creating the quality control and quality assurance protocol for the high-throughput robotic system (Hamilton Microlab Starplus) used for the newborn genetic screening assay for spinal muscular atrophy (SMA), based on the guidelines of the Clinical Laboratory Improvement Amendments (CLIA). The robotic system performs both DNA extraction from the newborn dried blood spots (DBS) and preparation of the DNA samples for the subsequent qPCR reactions. I also participated in the quality program for the reagents and reference materials (control samples with known DNA concentrations) used in the assay. The two most important aspects of the quality program are validation and verification.

 

In the validation process, we check whether the robots do what we expect them to do. The two important concepts in the validation process are precision and accuracy. For precision, we check whether the output values from the robots are consistent. For accuracy, we check whether the measured values are close to our expected values. The volume and temperature of the robotic system are validated when both the precision and accuracy fall within the ranges of our criteria. For the reagents and reference materials, since there are no defined expected values for them, only the precision studies are performed.

 

In the verification process, we check whether different robots, or different reagents or reference materials, give us the same results. For robot verification, it is sufficient if both the precision and accuracy data for the temperature and volume of the different robots fit the criteria. For the reagents and reference materials, the measured values from the old samples, the new samples, and the combination of the two groups should all have their precision measurements fall within the same criteria. The 95% confidence interval is the most commonly used criterion.
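A minimal sketch of such a confidence-interval check, using a normal approximation and invented CT values, could look like this:

```python
# Check whether the new-lot mean falls inside the 95% CI of the old lot.
# Normal approximation (mean +/- 1.96 * SD / sqrt(n)); all CT values invented.
import statistics, math

def ci95(values):
    """Approximate 95% confidence interval of the mean."""
    mean = statistics.mean(values)
    sem = statistics.stdev(values) / math.sqrt(len(values))
    return mean - 1.96 * sem, mean + 1.96 * sem

old_lot = [24.8, 25.1, 24.9, 25.0, 25.2]   # hypothetical CT values, old reagent lot
new_lot = [25.0, 25.3, 24.9, 25.1, 25.2]   # hypothetical CT values, new reagent lot
low, high = ci95(old_lot)
new_mean = statistics.mean(new_lot)
print(f"old-lot 95% CI: {low:.2f}-{high:.2f}, new-lot mean: {new_mean:.2f}")
print("new lot verified" if low <= new_mean <= high else "investigate new lot")
```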

 

When the robotic system was first set up, the dispensed volumes were not within our acceptable QA/QC criteria. I worked with the vendor to edit the liquid class, a set of programmable parameters controlling the operation of the liquid handler, such as the pipetting speed, whether to prewet the tips, and whether to aspirate some air to prevent the liquid inside the tips from dripping. With my knowledge of chemistry and fluid dynamics, I made some general hypotheses about the parameter values, and I ran a lot of volume dispensing tests to confirm my ideas and optimize the values. After that, I conducted the validation process required by CLIA to conclude that the volumes dispensed by the robots were within the QA/QC criteria.

 

 

 

 

4. Experience in interpreting and summarizing data collected from analytical instruments

 

In my dissertation research, I used a nuclear magnetic resonance (NMR) machine (Bruker Avance 600 MHz spectrometer) to study protein-lipid interactions, in particular the interaction of myoglobin (Mb) and palmitic acid (PA).

 

First, I showed that myoglobin (Mb) binds palmitic acid (PA). To measure the amount of fatty acid soluble in the aqueous solution, I used sodium 3-(trimethylsilyl)propionate-2,2,3,3-d4 (TSP) as the internal concentration and chemical shift reference. On the NMR spectrum, since there are 24 protons in the methylene peak of PA at 1.2 ppm and 9 protons in the trimethyl peak of TSP at 0 ppm, the concentration of the dissolved PA can be calculated from the concentration of TSP by [PA] = (area of the methylene peak / 24) / (area of the trimethyl peak / 9) x [TSP] (when all the protons are fully relaxed in the spectrum). Because of its long hydrocarbon chain, PA is barely soluble in aqueous solution, but in the presence of 0.2 mM Mb, the concentration of dissolved PA increased more than 40 times. A reasonable explanation is that PA bound to Mb and became soluble.
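The quantification formula above can be turned into a small worked example (the peak areas and TSP concentration here are invented):

```python
# Worked example of the TSP-referenced quantification, assuming fully
# relaxed spectra. The peak areas and [TSP] below are illustrative numbers.
def pa_concentration(area_pa_methylene, area_tsp_trimethyl, tsp_mM):
    """[PA] = (methylene area / 24 protons) / (TSP area / 9 protons) x [TSP]."""
    return (area_pa_methylene / 24.0) / (area_tsp_trimethyl / 9.0) * tsp_mM

# e.g. methylene area 4.0, TSP area 10.0, 1.0 mM TSP -> 0.15 mM dissolved PA
print(pa_concentration(4.0, 10.0, 1.0))
```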

 

Second, I showed where Mb binds PA. Because of the de-shielding effect of the iron in the Mb heme group, the protons around the hydrophobic heme pocket (where fatty acid is most likely to bind) can be separated from the rest of the protons in Mb. We noticed a selective intensity loss of the heme 8-methyl peak in the presence of PA, which supports the hypothesis that PA interacts with a local structural region of Mb; the interaction reduces the heme methyl mobility, which changes the relaxation rate and broadens the peak beyond solution-state NMR detection. Since the bound Mb no longer contributes to the peak, the fraction of the peak height lost in the presence of PA, multiplied by the myoglobin concentration, gives an estimate of the bound PA concentration: [Mb-PA] = [Mb] x ([Mb-PA]/[Mb]) = [Mb] x (1 - ratio of the heme 8-methyl peak height with PA to the height without PA). By titrating Mb with increasing amounts of PA, I can calculate the free and Mb-bound PA at the different concentrations of added PA and fit the data with the ligand binding equation:

 

[Mb-PA] = Bmax x [PA] / ([PA] + Kd)

 

where [PA] is the unbound PA concentration, Kd is the dissociation constant, and Bmax is the maximum capacity of PA binding (the Mb concentration).
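A minimal sketch of fitting this isotherm to titration data (with invented data points, not my actual measurements) could look like this:

```python
# Fit the one-site binding isotherm to a titration curve to estimate Bmax and Kd.
# The titration points below are invented, roughly following Bmax ~ 0.2 mM, Kd ~ 0.1 mM.
import numpy as np
from scipy.optimize import curve_fit

def binding(free_pa, bmax, kd):
    """One-site binding isotherm: [Mb-PA] = Bmax * [PA] / ([PA] + Kd)."""
    return bmax * free_pa / (free_pa + kd)

free_pa_mM = np.array([0.02, 0.05, 0.1, 0.2, 0.4, 0.8])       # hypothetical free PA
bound_mM   = np.array([0.04, 0.08, 0.12, 0.15, 0.17, 0.185])  # hypothetical bound PA
(bmax, kd), _ = curve_fit(binding, free_pa_mM, bound_mM, p0=(0.2, 0.1))
print(f"Bmax ~ {bmax:.3f} mM, Kd ~ {kd:.3f} mM")
```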

 

With the Kd obtained from the NMR spectra, I was able to build a mathematical model of intracellular fatty acid transport and show the important role of myoglobin in fatty acid metabolism. The results of my dissertation research have been published in 5 academic journal papers.
