My teaching at George Mason University (GMU) focuses on Computer Engineering.
My teaching schedule includes the following courses:
- ECE448: FPGA Design with VHDL: This course uses the Digilent Basys 3 board and teaches students how to communicate over UART, VGA, USB, and GPIO interfaces. Students learn data and control flow at the deep-down bits-and-bytes level. The course emphasizes three major ways in which FPGAs differ from anything the students have learned so far: 1) how logic gate-based digital implementation differs from Lookup Table (LUT)-based FPGA implementation, 2) how programming languages like C, which --by definition-- have state-holding variables, differ from hardware description languages like VHDL, which have state-holding as well as wire-type signals, and 3) the concept of "now" versus "after the arrival of the positive edge of the clock".
- ECE555: GPU Architecture and Programming: This course uses GMU's Hopper cluster to teach Nvidia's CUDA programming language. Students learn how to think "massively parallel", a programming paradigm in which programmers launch millions of threads, rather than just a handful as in CPU parallel programming. Strong emphasis is placed on how an efficient CUDA program can drastically improve data movement and processing at the bits-and-bytes level inside the GPU cores, resulting in 2x, 5x, or even 10x higher overall performance.
- ECE350: Embedded Systems and Hardware Interfaces: This course uses a Raspberry Pi 4 board, a Pi Camera module, and an Adeept parts kit. Students are introduced to bus protocols such as SPI, I2C, CAN, and UART. Projects are developed in an Embedded Linux environment and include the utilization of motors, LCD and seven-segment displays, WiFi, and an array of sensors.
- ECE655: Advanced GPU Programming and Deep Learning: This course uses Python and PyTorch to teach students neural network programming and real-time object recognition and classification. Widely used Convolutional Neural Network (CNN) models (e.g., ResNet, Inception, and DenseNet) are introduced, and their training and inference are studied in depth.
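The "massively parallel" paradigm taught in ECE555 can be sketched with a minimal CUDA vector-addition example that launches roughly a million threads, one per array element. This is an illustrative sketch of the idea, not material drawn from the course itself; the use of unified memory (`cudaMallocManaged`) is a simplifying assumption to keep the example short.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread handles exactly one element: the "massively parallel" mindset.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)                                      // guard the tail block
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // ~one million elements, one thread each
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);          // unified memory: visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(a, b, c, n);
    cudaDeviceSynchronize();               // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

In the CPU world this loop would be split across a handful of cores; here every element gets its own thread, and the hardware schedules them in groups across the GPU's cores.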


