I first heard about C3.ai from a friend who loves to listen to NPR – he’d heard the C3.ai radio ad there. I knew I wanted a unique summer internship experience that would complement the cutting-edge ideas in machine learning floating around in my classes. More importantly, I wanted an internship with the kind of meaningful work you can’t always get at a big company. The more I learned about C3.ai – from that friend, from my own research on the company, and from a career fair visit on campus – the more I found it was a perfect fit.
I joined C3.ai as a 2018 summer intern because of my interview experience. Every single individual I interacted with was incredibly interesting, thoughtful, and genuinely curious about my previous work and education at school. It was immediately clear that an internship at C3.ai meant meaningful, challenging, fun work with an even better team.
And that’s exactly what it was.
In my second week at C3.ai, I dove into a project that several data scientists had their eyes on – a pytest environment with access to the C3 Type System. Right off the bat, I learned about the C3 Type System, the C3.ai development process, and the team culture. By the end of the two-week project sprint, I had produced a testing suite that lets C3.ai data scientists write and test their Python code locally. I then went further and made the C3 Type System accessible from any Python REPL for quick experimentation and development.
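To give a flavor of the pattern (the client class and its methods below are illustrative stand-ins – the real C3 Type System connection helper and its API are different), a pytest fixture can hand each test a connected client, so tests read as plain Python:

```python
import pytest

# Stand-in for a remote C3 Type System client; the real connection helper
# and its API differ (this class is an illustrative assumption).
class StubC3Client:
    def __init__(self):
        self._store = {}

    def upsert(self, type_name, record):
        # Save the record under its type and return it with a generated id.
        saved = dict(record, id=f"{type_name}-{len(self._store)}")
        self._store[(type_name, saved["id"])] = saved
        return saved

@pytest.fixture
def c3():
    # In a real suite this fixture would open a session against a
    # development server instead of returning a local stub.
    return StubC3Client()

def test_upsert_assigns_id(c3):
    rec = c3.upsert("Sensor", {"name": "sensor-1"})
    assert rec["id"].startswith("Sensor-")
```

The fixture is the key piece: because the connection is injected, the same tests can run against a stub locally or a live environment in CI.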
After tackling that short project, I began the main focus of my internship – investigating and building platform support for the Open Neural Network Exchange (ONNX) format. ONNX is an open source effort, initiated by Facebook and Microsoft in 2017, that promotes interoperability between deep learning frameworks. It makes deep learning models more portable by providing a single format for storing them, along with export/import support across a wide variety of frameworks. With ONNX, a data scientist can develop a model in the framework of her choice and export it, and another data scientist can import that model and run inference in a framework of her own choice. Super cool stuff!
I focused on enabling support for importing pre-trained ONNX models into the Caffe2, MXNet, and TensorFlow frameworks, and on building scalable, composable machine learning pipelines with the C3.ai ML Pipeline 2.0 to run inference on the imported models. After accomplishing this for a simple ImageNet classifier from ONNX, I implemented an end-to-end set of face detection and face recognition algorithms called ArcFace. For both, I defined several C3.ai Types and turned the pre-trained ONNX models into reusable C3.ai objects in ML Pipeline 2.0, so that any data scientist can easily run inference on their own data, or extend the C3.ai Types I built to run inference on their own pre-trained ONNX models. As a final touch, I used the ArcFace model to generate face embeddings of our very own C3.ai employees and presented a few fun examples (including a “Find Your Perfect Match at C3.ai” example application and a comprehensive similarity graph!) of what the C3 AI Suite can do with pre-trained ONNX models.
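The matching and similarity-graph examples come down to comparing embedding vectors. A sketch with made-up toy vectors (real ArcFace embeddings are 512-dimensional outputs of the model; the names and values here are invented for illustration):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for face embeddings (real ArcFace embeddings are
# 512-dimensional; these 3-d vectors just illustrate the comparison).
embeddings = {
    "alice": np.array([0.9, 0.1, 0.2]),
    "bob":   np.array([0.88, 0.15, 0.18]),
    "carol": np.array([0.1, 0.9, 0.3]),
}

# "Find your match": for each person, pick the most similar other person.
for name, vec in embeddings.items():
    match = max(
        (other for other in embeddings if other != name),
        key=lambda other: cosine_similarity(vec, embeddings[other]),
    )
    print(name, "->", match)
```

A full similarity graph is the same computation over all pairs, with an edge kept wherever the cosine similarity clears some threshold.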
Throughout my internship, I was constantly impressed by how interesting and open the people at C3.ai are. Each interaction – with a team member in a 9 am discussion session, a data scientist testing my pytest environment, or CEO Tom Siebel over lunch with all the interns – was precious and thought-provoking. Weekly lunch-and-learns and team bonding events – like kayaking on a Wednesday morning before work – made it easy to get to know so many people at C3.ai and learn from each and every one of them. And having a superb mentor – insanely smart, knowledgeable, and passionate – guiding me through the entire process was the cherry on top of a summer to remember, a summer that will be difficult to top.