Position @ The University of Edinburgh, United Kingdom
PhD Student, Software Modelling/Optimization for AI Computing Architectures
The project aims to explore the potential benefits of software optimization techniques for modelling and training Large Language Models (LLMs), with a specific focus on Transformers, targeting various unconventional hardware architectures and computing domains (Binary, Analog, Bitstream). It involves developing Python-based libraries for training and inference that suit these unconventional computing domains. These libraries will eventually be integrated with PyTorch and/or TensorFlow to facilitate modelling and quantization-aware training for different AI hardware architectures.
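As background for candidates, the kind of quantization-aware training mentioned above can be illustrated with the Binary domain: weights are quantized to {-1, +1} in the forward pass, while gradients flow through via a straight-through estimator (STE). The sketch below is purely illustrative and uses plain Python with made-up function names (`binarize`, `binarize_grad`); it is not part of the project's actual codebase.

```python
def binarize(w):
    """Forward pass: quantize a real-valued weight to {-1, +1} (sign function)."""
    return 1.0 if w >= 0 else -1.0

def binarize_grad(w, upstream_grad, clip=1.0):
    """Backward pass: straight-through estimator (STE).

    The sign function has zero gradient almost everywhere, so QAT
    passes the upstream gradient through unchanged, clipping it to
    zero for weights far outside [-clip, clip] to keep latent
    weights from drifting without bound.
    """
    return upstream_grad if abs(w) <= clip else 0.0

# Tiny demo: a binarized dot product of latent weights with an input vector.
weights = [0.7, -0.2, 1.5, -0.9]
inputs  = [1.0,  2.0, 3.0,  4.0]

y = sum(binarize(w) * x for w, x in zip(weights, inputs))
# y = 1*1 + (-1)*2 + 1*3 + (-1)*4 = -2.0

# Gradients w.r.t. each latent weight under STE, assuming dL/dy = 1.0,
# so the upstream gradient for weight i is simply inputs[i].
grads = [binarize_grad(w, x) for w, x in zip(weights, inputs)]
# weights 0.7 and -0.2 are inside [-1, 1]; 1.5 is clipped to 0.
```

In real use the latent full-precision weights are updated with these gradients while only the binarized copies are used for inference, which is what makes such models attractive for binary and bitstream hardware.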
The required skills are as follows:
* Excellent programming skills using Python (mandatory).
* ML/AI modelling using PyTorch or TensorFlow (mandatory).
* DNN quantization and pruning techniques.
* Tcl and Makefile scripting.
* Basics of computer architecture.
For informal enquiries, please contact Dr. Shady Agwa, email: shady.agwa@ed.ac.uk