Ivan Nonveiller is currently driving Nebula AI’s global go-to-market strategy in the artificial intelligence industry. He works with startups on business acceleration and growth and has been advising some of the largest Canadian brands on marketing innovation and technology. 

We met after his talk on attribution modeling at the Digital Transformation & Green Energy Innovation Summit in Montreal.

What made you decide to take on Nebula AI’s marketing?

About a year ago, I was having a coffee at Humble Lion with a friend who does AI research at McGill University, and he gave me a staggering statistic: the need for AI computational power doubles every 3.5 months. I couldn’t believe it, so I went online to double-check. It turned out the figure came from OpenAI, a San Francisco-based AI research institute co-founded by Elon Musk, and it was correct. So when Charles Cao, the founder of Nebula AI, offered me a role at a company that’s building a decentralized artificial intelligence platform, I couldn’t say no. Charles was on the right side of history. Data privacy and protection are perhaps the most important issues in the digital space at the moment, and the decentralization of computational AI resources will play a big role in addressing them.
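A doubling time of 3.5 months implies startling annual growth. A quick back-of-the-envelope sketch of the arithmetic, using the figure cited above:

```python
# If AI compute demand doubles every 3.5 months, annual growth is
# 2 raised to the number of doublings that fit in 12 months.
months_per_doubling = 3.5
doublings_per_year = 12 / months_per_doubling
annual_growth = 2 ** doublings_per_year
print(f"~{annual_growth:.1f}x per year")  # roughly 10.8x
```

In other words, the statistic amounts to demand growing by roughly an order of magnitude every year.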

What is your biggest challenge with Nebula AI right now?

I would say that our biggest challenge is to tell the story of a complex product in layman’s terms and to stimulate adoption. Nebula AI is a creative, cutting-edge software lab with a team of 20 developers. They used a wide range of technologies to build Orion: Go, Java EE, Node.js, Solidity, Python, C and C++, along with NoSQL and MySQL databases and blockchain infrastructure. However, the final product is pretty simple. Basically, Orion is a platform that connects GPU owners with AI users, processes payments, and dispatches AI tasks. It costs one-third of what centralized AI providers charge, and it does for GPU cards what Uber did for cars.
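To make the “Uber for GPU cards” analogy concrete, here is a purely hypothetical sketch of what matching AI tasks to idle GPUs might look like. The class names and fields are illustrative assumptions, not Nebula AI’s actual Orion code:

```python
from dataclasses import dataclass

@dataclass
class GpuNode:
    """A GPU made available by its owner (hypothetical model)."""
    owner: str
    memory_gb: int
    busy: bool = False

@dataclass
class AiTask:
    """A unit of AI work submitted by a user (hypothetical model)."""
    user: str
    memory_gb: int

def dispatch(task, nodes):
    """Assign the task to the first idle node with enough memory."""
    for node in nodes:
        if not node.busy and node.memory_gb >= task.memory_gb:
            node.busy = True
            return node
    return None  # no capacity available right now

nodes = [GpuNode("alice", 8), GpuNode("bob", 24)]
chosen = dispatch(AiTask("carol", 16), nodes)
print(chosen.owner)  # "bob": alice's card is too small
```

A real platform would of course add pricing, payment settlement, and fault tolerance on top of this kind of matching step.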

Can you summarize your pre-launch marketing efforts at Nebula AI?

Since our technology is so innovative, we wanted to let people know what we are building months before our product launch. So we started with awareness campaigns pretty early and managed to build large support groups via Twitter, Telegram, Reddit, WeChat and Weibo.

We did a few events, some PR, and some influencer marketing, but we mostly focused on user-generated content campaigns in North America, Russia, and Asia-Pacific. YouTube now has over 500 videos dedicated to Nebula AI, and fewer than 15 of them are our own creation. All the others were put together by supporters. Ultimately, my job is to help shape Nebula AI into a household name. Our pilot platform will enter the market next week, and the AI community has already reacted positively.

What is your current marketing structure?

We run a hybrid inbound / ABM (account-based marketing) program. I would say it’s 20% content marketing and 80% explicit outbound to target accounts. Inbound takes a long time; it is essentially a long-tail strategy. It’s a necessary piece and a key driver for customer acquisition, but it cannot be a stand-alone tactic.

The thing is, we don’t have time to wait. So what we do is we equip the business development reps with content that’s catered to their targeted segment and we target the accounts with ads before the reps reach out to them. So our strategy is content driven but at an account level. Our content is specific and narrow enough to appeal to the kind of buyer we want to sell to.

We also track whether the content is being read and whether it is ranking for the keywords we care about. We track cohorts of keywords that we want to rank for in order to see which articles are performing best. That allows us to send good signals to Google, and not only individual pieces of content but the whole domain’s authority gets rewarded. We also try to get feedback from sales on which pieces of content were read by the customer.
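As an illustration of the cohort-tracking idea (the articles, keywords, and positions below are made up), one could average each article’s search position across the cohort and sort:

```python
# Hypothetical data: article -> {keyword: position in search results}
rankings = {
    "what-is-decentralized-ai": {"decentralized ai": 4, "ai computing": 12},
    "gpu-sharing-explained": {"gpu sharing": 2, "ai computing": 7},
}
cohort = {"decentralized ai", "ai computing"}  # keywords we care about

def best_articles(rankings, cohort):
    """Sort articles by average position on cohort keywords (lower is better)."""
    scores = {}
    for article, positions in rankings.items():
        hits = [pos for kw, pos in positions.items() if kw in cohort]
        if hits:
            scores[article] = sum(hits) / len(hits)
    return sorted(scores, key=scores.get)

print(best_articles(rankings, cohort))
```

In practice this kind of scoring would be fed by a rank-tracking tool rather than hand-entered data, but the grouping logic is the same.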

Why did you decide to talk about marketing attribution today?

One of my mottos is: tell me how you measure me, and I will tell you how I will behave.

I believe that a meaningful percentage of poorly performing marketing teams struggle because they have been given the wrong KPIs. Marketing has to generate demand. It has to take care of customers, retain them, and make them happy. Marketing has to take care of the brand and the community around it. Finally, marketing has to prepare customers for the future vision and for change. Attribution allows executives to realize that lead-centric or lead-only KPIs for marketing departments are not enough. A lead is just one unit of demand, and the relevant units of demand vary from company to company.

What I am generally trying to do is to show that we can identify a number of engagement signals that are either created or detected by marketing, stratify them into, for example, “contact same day”, “contact next day”, and so on, and communicate them to sales so they can close deals faster. As an example, if we get a signal that one of our users has been very active with our chatbot, then marketing should integrate that activity into the CRM. It should trigger a “contact next day” high-conversion alert so that the user is contacted within 24 hours.
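A minimal sketch of that signal-to-alert stratification; the signal names and tier assignments below are illustrative assumptions, not the company’s actual CRM integration:

```python
# Hypothetical mapping from engagement signals to sales-alert tiers.
SIGNAL_TIERS = {
    "chatbot_active": "contact next day",    # high-conversion signal
    "pricing_page_view": "contact same day",
    "newsletter_open": "nurture",
}

def crm_alert(user, signal):
    """Translate a marketing signal into a CRM alert record for sales."""
    tier = SIGNAL_TIERS.get(signal, "no action")
    return {"user": user, "signal": signal, "alert": tier}

print(crm_alert("user-42", "chatbot_active"))
```

The point is simply that each detected signal carries an explicit service-level expectation that sales can act on.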

For people that are a bit less familiar with AI, could you tell me why we need it?

Society is growing at a massive rate. An increase in scale also increases entropy in systems and requires a large number of tasks to be automated. AI can process many tasks much faster than humans can. It can automate technical support, manufacturing, medical detection, translation, marketing, and that’s only the beginning.

However, the reason we need AI the most today is that we need to make sense out of data. As you know, new forms and sources of data are coming from the web, mobile devices, social media, IoT sensors, video/audio content, networks, log files, and so forth. Much of it is generated in real time and on a very large scale. These data sets are high in volume, velocity, and variety. Their size or type is beyond the ability of traditional relational databases to capture, manage, and process with low latency.

With that said, we can apply AI techniques such as machine learning, predictive analytics, and natural language processing to large, diverse, and previously untapped data sources. There are different types of data, such as structured, semi-structured, and unstructured data, but that’s beside the point. Artificial intelligence helps businesspeople, analysts, and researchers gain new insights and make better decisions.

What is the current state of AI?

Right now, artificial intelligence is known as narrow or weak AI. It can only perform very specialized tasks (e.g., playing chess, internet search, facial recognition). In the long term, however, many researchers are hoping to create an artificial general intelligence (AGI): a machine that could successfully perform any cognitive task a human being can. There are already over forty organizations worldwide doing active research on AGI.

The most difficult AI problems are informally known as AI-complete, meaning that solving them is equivalent to the general aptitude of human intelligence. AI-complete problems are hypothesized to include general computer vision, natural language understanding, and the ability to deal with unexpected circumstances.

We often talk about deep neural networks (DNN) in discussions of AI. Can you explain what DNNs are?

DNNs are deep learning architectures inspired by the information processing and communication patterns of biological nervous systems. But even though artificial neural networks were originally inspired by neuroscience, many of the major developments in machine learning were guided by insights into the mathematics of efficient optimization rather than by neuroscientific findings. Neuroscience and machine learning have not converged yet, and deep learning models differ in many ways from the structural and functional properties of biological brains.

Of course, neuroscience speaks a very different language than machine learning. It deals with molecular biology, genetics, neurotransmitters, and mechanisms for computation and information storage. Deep learning models, by contrast, are mostly instantiations of a single principle: function optimization. Researchers are mostly trying to minimize classification errors, which can lead to the formation of rich internal representations and powerful algorithmic capabilities.
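To make “function optimization” concrete, here is a toy, self-contained example: not a real DNN, just a single sigmoid unit trained by gradient descent to reduce its error on two points.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Two toy labeled points: negative input -> 0, positive input -> 1.
data = [(-2.0, 0.0), (2.0, 1.0)]
w, b, lr = 0.0, 0.0, 0.5  # weight, bias, learning rate

for _ in range(200):
    for x, y in data:
        p = sigmoid(w * x + b)
        # Gradient of the squared error 0.5 * (p - y)**2
        grad = (p - y) * p * (1 - p)
        w -= lr * grad * x
        b -= lr * grad

# After training, the unit separates the two points.
print(sigmoid(-2.0 * w + b) < 0.5 < sigmoid(2.0 * w + b))  # True
```

A deep network stacks millions of such units, but the training loop is the same idea: nudge parameters downhill on an error function.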

However, DNNs are already producing results superior to human performance in fields such as natural language processing, computer vision, speech recognition, social network filtering, bioinformatics and games such as chess or Go.

The Canadian Digital Transformation & Green Energy Innovation Summit took place in Montreal, Canada on October 17, 2018.

You can learn more about Nebula AI at

For more on the Orion Cloud platform, visit

Léandre Larouche
Léandre is a student, writing coach, and freelance writer based in Montreal, Canada. He is finishing an undergraduate degree at Concordia University, writes content for tech companies in Canada and the U.S., and helps clients improve their writing skills. Previously, he has held positions as a resume editor and publication manager and studied at The University of Nottingham, England. Léandre's interests include artificial intelligence, blockchain technology, cryptocurrencies, and big data.
