Research Assistant for AI-based screening of Cervical Cancer [Closed]
Please prepare a detailed CV and cover letter as a single PDF and upload it in [Form Closed]. APPLICATIONS WITHOUT A COVER LETTER WILL NOT BE EVALUATED!
The position is open until filled; we will shortlist and interview candidates every week, starting from May 2, 2022.
We are looking for a Research Assistant to work on a research project, “AI-based screening of Cervical Cancer”, in partnership with the Institute for Implementation Science and Health (IISH). As part of the research, we will collect Visual Inspection with Acetic Acid (VIA) images of the cervix of women in Nepal, which will be annotated by expert medical professionals. The goal is to research, develop, and validate smartphone-friendly deep-learning-based methods that help non-experts take VIA images and assist them in decision making during VIA screening.
Cervical cancer is the second most common cause of cancer death in women worldwide, with 85% of cases and deaths occurring in low- and middle-income countries (LMICs). In low-resource settings, visual inspection of the cervix with acetic acid (VIA) is one of the most widely used techniques for cervical cancer screening. VIA screening is regarded as the most viable and practical approach to early diagnosis of cervical cancer, and among the most beneficial for, for instance, lowering disease burden at a reasonable cost. However, the shortage of well-trained medical personnel and the subjective nature of the test have hindered VIA’s broad reach. The goal is to address this problem with an automatic and more consistent deep-learning-based VIA screening tool.
The project is expected to yield high-quality research publications with both applied and methodological novelty.
We are looking for a candidate with the following skills and experience:
- An eagerness to learn, a hardworking attitude, a curious mind, and sincerity.
- Fluency in Python and the Python ecosystem for scientific computing and machine learning, particularly NumPy and scikit-learn.
- Fluency in one or more popular deep learning frameworks such as PyTorch or TensorFlow.
- Experience in training convolutional neural networks, with a reasonable understanding of basics such as why CNNs are preferred over fully connected networks for images, regularization, overfitting, and the bias–variance tradeoff.
- Experience with (or willingness and interest to explore) smaller deep learning models that can run on mobile phones.
- Good performance in relevant courses such as linear algebra, computer vision, machine learning, image processing and statistics.
- Good proficiency in communicating methods and results of experiments.
- Experience with state-of-the-art CNN- and transformer-based image classification networks.
- Experience (or interest) in deploying mobile-friendly low-computational cost models.
- An understanding of the importance of regularization and awareness of key challenges such as generalization and explainability.
- Good skills in scientific writing in English and in visualizing experimental results with graphs and figures.
- Experience with Git version control.
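As a quick illustration of one of the basics listed above (why CNNs are preferred over fully connected networks for images), a parameter-count comparison can be sketched in a few lines of Python. The layer sizes below are illustrative assumptions, not project specifications:

```python
# Compare the parameter count of a single fully connected layer on a
# flattened 224x224x3 image against a single 3x3 convolutional layer
# with the same number of output channels (illustrative sizes only).

def dense_params(height, width, channels, units):
    # Weights (one per input pixel per unit) plus one bias per unit.
    return height * width * channels * units + units

def conv_params(kernel, in_channels, out_channels):
    # Weights for a kernel x kernel filter per (in, out) channel pair,
    # plus one bias per output channel; independent of image size.
    return kernel * kernel * in_channels * out_channels + out_channels

print(dense_params(224, 224, 3, 128))  # 19,267,712 parameters
print(conv_params(3, 3, 128))          # 3,584 parameters
```

Weight sharing and locality make the convolutional layer several orders of magnitude smaller, which is also why compact CNNs are attractive for the mobile-phone deployment mentioned above.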
Responsibilities
- Implement or adapt existing pipelines and basic software platforms to collect microscopic imaging data.
- Implement existing state-of-the-art (SOTA) object detectors on an in-house microscopic image dataset.
- Identify and document key challenges in using existing models for low-cost smartphone-based microscopic images.
- Draft research papers, under the guidance of the supervisors, on the performance of deep learning object detectors and their role in improving the efficacy of smartphone-based microscopes.
- Explore novel and better methods for small object detection that could be implemented in smartphone microscopes.
- Communicate research results to the broader community through publications in international conferences and journals.
Minimum Required Qualifications
A Bachelor's degree in Engineering or Computing Sciences.
Employment Duration and Salary
- 15 months
- Full-time position
- Salary: Depending on expertise and experience.
Contact: Dr. Bishesh Khanal (Email: email@example.com)