#1-ranked iPaaS on G2 among 236 competing solutions

On Demand Webinar

Automate Data Warehousing to Deliver Insights at the Speed of Light Across the Enterprise

Introducing Celigo’s new Data Warehouse Business Process Automations for Snowflake and BigQuery.

When the right people lack access to the right data from the right sources in the data warehouse, business decisions get delayed. Long development projects, dependency on scarce engineering teams, outdated technology, and ever-changing data pipelines add to the frustration.

To address this challenge, Celigo launched the Data Warehouse Business Process Automations for Snowflake and BigQuery. Join us as we demonstrate how to accelerate insights by delivering easy-to-implement, reusable data pipelines for extracting and loading data from multiple SaaS applications into Snowflake and BigQuery — all within a single product.

Watch now!

Full Webinar Transcript
Okay, let’s get started here. Hello again, my name is Ebru, and I’m a product marketing manager at Celigo. This week we are hosting a series of virtual events for our new business process automation products, and this is the second event in the series. Today, we’ll be introducing our new Data Warehouse Business Process Automations for Snowflake and Google BigQuery. In this event, you will learn how you can automate data warehousing to accelerate insights across your organization. I’m here today with senior solutions consultant Kelly Izer, who is also a business process automation expert, and he will show you both products live in action. A quick note: at the end of the demo, we’ll be sharing a limited-time special promotion valid for today’s attendees only. To unlock the offer, you’ll need to act before the webinar is over, so stay tuned. We’ll also have a Q&A right after Kelly’s demo, so if you have any questions during the presentation or the demonstration, type them into the questions pane and they’ll be answered once it’s Q&A time. Let me get started by giving you a quick overview of Celigo. We are an iPaaS (integration platform as a service) company. Today we have more than 450 employees across six global offices, and we are highly rated on the industry’s most popular review sites such as G2, Gartner, the Salesforce AppExchange, and many more. We have over 4,000 customers, and you may recognize some of the logos here; they are very well-known brands. For all of our customers today, our platform processes billions of records every month, and as of this moment, our customers are automating more than 40,000 business processes on our platform. If you are wondering why more than 4,000 customers chose Celigo, one of the reasons is that our platform is built not only with IT in mind, but also with business users in mind. And we automate business processes across the entire organization.
And that’s not the only thing. We go beyond integrating anything to anything: we automate and optimize complete business processes end-to-end with our unique embedded business logic. What that really means is that best practices, based on learnings from the industry, learnings from our customers and their automations, and recommendations from software vendors, are all built in, so that when you are getting ready to automate a business process, you are not required to be an expert. You don’t have to go figure out where to start, how to start, or what your ideal workflow should look like. We have everything built in for you in our products, and we are the only iPaaS to do so. So we help organizations accelerate digital transformation. On our platform, you have two options: you can either build your own business process automations, or you can select from our portfolio of managed, customizable, pre-built business process automations and get started easily. So that was an overview of Celigo and our platform, and now I would like to talk about data warehousing. You can see here I have two big numbers: 23 and 19. What are these? A study done a couple of years back found that data-driven organizations are 23 times more likely to acquire customers than their competitors and 19 times more likely to generate above-average profits. So being data-driven is really the key to staying competitive. It is also key to driving more revenue, more profit, and more growth, and to staying agile through critical decisions that are backed by data. So let’s quickly go through the business process for data warehousing and analytics. Your business teams will most likely ask a data analyst to create a report or analyze a piece of their business so they can use those insights for their business decisions.
And then the analysts, if they don’t have all the data they need in the data warehouse, or if they identify that the key source applications aren’t connected to the data warehouse, will go ask the IT or engineering teams to integrate those applications into the data warehouse. The engineering or IT teams will develop the integrations and test them. Once everything is up and running, the analysts will have access to all the data they need, and they will be able to apply their transformations, normalize and aggregate the data, analyze it, visualize it, create reports on it, and help the organization make decisions backed by data. So that was an overview. Having just gone through the process, it shouldn’t be a surprise to hear that even with the rising adoption of cloud data warehouses, organizations are still having difficulty becoming data-driven. I would like you to think about three questions and try to answer them. Number one: are your analysts dependent on engineering resources to have the data they need within the data warehouse? Do long integration projects delay business decisions? My second question: what happens if something changes in your source application? Do all your integrations break? Data isn’t useful when it’s not up to date, and fixing it just adds more weight, with more teams being involved until things are repaired. And my third question: given the ever-increasing adoption of new SaaS applications, does every single request to add a new data source to the data warehouse require a new integration project to be kicked off by engineering? I can imagine your answers to these questions. When you consider all of this, it is almost impossible with traditional processes to get data into the data warehouse fast enough to make fast business decisions.
At Celigo, as you know if you’re already familiar with us, we specialize in automating business processes. And we want to help organizations get to data as fast as possible so they can get the insights they need, without delays, at the time they need the data. And not just IT or engineering teams, but also data analysts and business users: we want them to be able to go implement their own automations. Based on that, we wanted to build a product. And we knew that building point-to-point integrations, like Snowflake to Salesforce or Snowflake to Zendesk, and then expecting our customers to install and manage multiple integrations, was not going to be the ideal solution. Because, as you all know if you’re already dealing with this, there will always be a new data source to add, and an ideal solution is something that can keep up with that while remaining easy to use and easy to manage. So, for the first time at Celigo, we introduced the Celigo Business Process Automations. These pre-built products go beyond point-to-point integrations: they can automate key processes between one destination application and multiple source applications. In this case, we launched two new products: one connects anything to Snowflake, and the other connects anything to Google BigQuery. Kelly will, in a minute, show you both products live in action. And we are actually the only iPaaS that can do that through pre-built, pre-packaged integrations today. So here’s a quick look at how it works. Our business process automation extracts and loads data from your selected source applications into a data warehouse through a single automation. By leveraging pre-configured applications (some that are available are displayed on the left here), and also by leveraging the more than 350 pre-built connectors we have available on the platform, new source applications can be added to the automation within minutes.
And Kelly will show you how it’s done live, so you’ll see for yourself. Snowflake and BigQuery best practices are already built into the product, so as I mentioned before, you don’t have to figure out how to automate or where to start. We’ll extract data from the source applications, load it into a temporary table, merge it with a production table, and then delete it from the temporary table in the data warehouse. Business users can easily set up and manage these integrations through the guided user interface, and we also have tools for easy management, so there is no dependency on engineering or IT resources for setting this up and managing it. And because Celigo automates only the extract and load processes, to support extract-load workflows for data warehousing, users will be able to transform the data within the data warehouse the way they want using SQL queries. This way, they get access to complete data without losing anything, and they can create the reports they need based on their specific business objectives. So, having given you this overview, I will now hand it off to Kelly. As a reminder, if you have any questions, please type them into the questions pane and we’ll be answering them right after Kelly’s demonstration. Also, let me make you the presenter here, Kelly. Thanks, Ebru. In order to make the data warehouse BPA process even easier, Celigo has created a template that’s available in the integrator.io marketplace. When you enter the marketplace, you can filter to the desired data warehouse, either Snowflake or BigQuery, by using the filter in the top right. Once you find the data warehouse application you want to use, select the tile, click Preview, select Install Now, and then proceed. You’ll then be presented with the options and the steps to follow in order to install the BPA. First, we need to enter the credentials for our data warehouse.
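As an aside, the load pattern described above (extract from the source, land the records in a temporary table, merge them into the production table, then clear the temporary table) is a standard warehouse staging technique. The sketch below is a minimal, hypothetical illustration of the kind of SQL such a step might run; the table and column names are made up, and it is not the SQL Celigo’s product actually generates.

```python
# Sketch of the temp-table staging pattern: MERGE upserts staged rows into
# the production table by key, then the temp table is cleared for the next
# load. Table/column names (CUSTOMERS_TEMP, CUSTOMERS, ID, ...) are
# hypothetical placeholders, not Celigo-generated SQL.

def build_merge_statements(temp_table, prod_table, key, columns):
    """Return the SQL statements a merge-and-delete step would run."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    col_list = ", ".join(columns)
    src_list = ", ".join(f"s.{c}" for c in columns)
    merge = (
        f"MERGE INTO {prod_table} t USING {temp_table} s "
        f"ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )
    cleanup = f"DELETE FROM {temp_table}"
    return [merge, cleanup]

stmts = build_merge_statements("CUSTOMERS_TEMP", "CUSTOMERS", "ID",
                               ["ID", "EMAIL", "NAME"])
```

Because the MERGE upserts by key, re-running a load is idempotent, and clearing the temp table afterwards keeps the staging area empty for the next extract.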
If it’s the first time you’re coming in here, the application will prompt you for the appropriate credentials to establish your connection. I’ve already set mine up, so I can use an existing connection. The application will then ask which predefined sources you want to integrate with. Again, these are predefined; it doesn’t mean these are all you can use. These are just already preset, and I’ll show how we can use systems that are not predefined here momentarily. So here you can see I want to use Shopify and HubSpot, and then select Next. The application is then going to ask us for the credentials to our first application. Again, if it’s the first time, you’ll be prompted for the appropriate credentials; in my case, I’ve already set that up. We’ll then configure and define which resources we want synced with our data warehouse, so I’m going to say customers and products. We’ll then put in the credentials for our second endpoint. Again, I’ve already set them up, so I’m going to reuse those here. We’ll configure which resources we want synced. Once that’s completed, you’ll see that the BPA process has automatically created the flows necessary to sync the two applications that we’ve selected. Next, we need to configure where we want this data to be loaded. The first thing you’re going to see, if we come into the worksheet, is that the tables aren’t there yet, so we’re going to have the process define them. First, I’m going to go into my first flow here and go to the settings. And here you’ll see, if I navigate down just a bit, under Custom, this is where we define our temp table, where we want to put the data when we first move it. So here I’ll populate my database name. We’ll then navigate to my second flow and go to Settings. Here we’ll want the temp table name we copied when we created it, because this flow needs to know where to pull that data from.
And then we’re going to come back here and define the production table we want it to move to. Once that’s completed, we can look at our data warehouse. I’ll refresh the table list, and there you’ll see the two tables it automatically created for me. If I go up here and select Run Now, you’ll see that currently there is no data in the production table. So next, I’m going to go back out and finish setting up my other flows here as well. You follow the same steps: Settings, where’s our temp table, and then you can configure the merge and delete. Now that that’s complete, we can exit out. Again, we’ll refresh our table list and see those two new tables were set up; there are my new Shopify tables. We can now come into the integration and turn these flows on. All right, now that we’ve enabled them, that is it: I can now process data. One thing I do like to highlight is that we’ve already linked the merge-and-delete flow for you automatically, so after the first flow is executed, it will automatically execute the second flow to then move the data. Let me use a custom date to make sure that I’ve got some records here. We can expand the list here and see each step as it’s being executed. So there is the first step, and now you’ll see that the second step is automatically being executed. As soon as this is complete, we can go check our database, and we should see our data has been populated. We can now go to our database, go to our table here, and execute that, and now we see that we have our data. Over on the right-hand side, it’ll tell you which resources were moved, and then we can click here and see the details of each of those records. All right, so those were the predefined endpoints. We can also add in additional endpoints that aren’t predefined. If we come over here, the first thing we have to do is what we call registering a connection.
So in this case, I’m going to add in a Salesforce integration. I’m going to add Salesforce and register that connection. Then, if we come back to the Settings area, you’ll see the unconfigured flows; there is the connection I have now specified. I can save that, and now, if we give this just a moment, it’s going to create the flows for us. And there you’ll see we now have a new set of data flows pointed to our non-predefined endpoint. So I can now follow the same process I just did. I can go in here; let me look at my database names and see what I already have. You’ll see I don’t have any Salesforce tables yet, so I’ll come in and configure the settings. Now that we’ve configured the first step, we’ll go into the settings for the second step. And now that we see it’s completed, we can check our database: there are my Salesforce tables. I can come back and enable these. Now, the one difference here is that I need to at least define what my export is going to be. Since it’s unregistered, the product is not going to preset that for you. So, again, I can simply come in here and give it a name, and then with Salesforce, we’re going to write the query. When we do batch-based flows, we use the query editor, so I can come in here, enter my query to export all the records, and select Preview. You’ll see I’m exporting data, so I can save and close that, and now we should be able to run. Again, I’m just confirming we’ve already preset the secondary flow to be executed. I can then come in and execute this flow as well. And if there are any errors, just like with any other integration within integrator.io, we will generate an error message that can then be viewed, and you can take the appropriate steps to resolve it accordingly. So that’s how we can work with the data warehousing automation for Snowflake.
We also support BigQuery, and it’s very similar. First, I’m going to come out here and go back to our marketplace. In the top right, I can filter accordingly, so we’ll go to BigQuery. We’ll follow the same steps. If it’s the first time I’m coming in here, again, it’s going to ask me for the appropriate information. Now with BigQuery, since it’s OAuth 2.0, you would configure your scopes and such and set that. I have already established my connection, so I can reuse that here. We’re going to define which applications we want to have backed up. So here let’s say we use Zendesk, and we’ll just leave it at the one for now. I’ll select Done and Submit. It’s then going to ask us for our connection to that endpoint. Again, I’ve already set this up ahead of time, so I’ll use that endpoint. We’ll define which resources we want exported, and there, it has now established our flows. Now, in our BigQuery environment, which I’ve already set up here, you can see my database, so I’m going to come back and do the same thing I’ve done before. We’ll go in and configure it accordingly, so I can come in and populate the necessary information here. Let me just set this table name accordingly; we’ll copy that to be used in the next step. I will say I did this one a bit backwards, and the reason I say that is that with this integration, under the settings, I can actually define my Project ID and database, and it would have automatically populated those fields for me within the integration flows. So I could have set that there, had it defined, and it would have been populated for me. But then, the same as with the Snowflake integration, I can come out here and enable the flows. Now, if I come back to my BigQuery environment and expand my database, you’ll see we’ve automatically created the tables for you. And now we can execute these flows and have data populated accordingly.
It’s now completed the first flow, and it’s automatically executing the second flow. Once this is complete, we’ll be able to go back to our BigQuery. Now that that’s complete, we’ll go over, check our table, and preview; just refresh here. And there is our data. All right, are there any questions? Let’s open this up for questions. Yes, we have some questions, and I’ll be reading them. So the first question is: does Celigo provide the warehouse cloud storage? I can answer this one. In order to use the BigQuery or Snowflake Data Warehouse Business Process Automations, you need to have an existing Snowflake or BigQuery account; we don’t provide your data warehouse storage or subscription. Okay, another question is: how long does it take to integrate Snowflake, including training? That really depends on whether you are using one of the preconfigured source applications that Kelly showed you or creating a new source application connection. In either case, we have documentation that walks the user through the steps required to set this up, and our UI is very intuitive, taking you through the necessary steps one by one. Barring any technical difficulties like we experienced today, you should be able to connect one source application in less than 10 minutes, I would say. Do you agree with that, Kelly? I do, yes, very much. Barring any technical difficulties, you should be able to get these integrated and passing data within 15 minutes or less. Okay, so I’ll move to the next question: what’s the frequency of syncing data from a source application into Snowflake or BigQuery? Can you take this question, Kelly? I can, yes, thank you very much. For a batch-based flow, we have the scheduler here, which I’m showing; what I call our preset schedules are here. You can set it to run as often as every 15 minutes.
You can specify a start and end time, and you can specify the days of the week you want it to be executed. If you want more control than that, we have a cron-based scheduler, which you can set up to pass data as often as every five minutes. So the short answer is: up to every five minutes. Okay, another question, and it’s really related to this one: is it possible to define real-time, event-based triggers for this integration? It is. This is a fully customizable integration, so as I’m setting these up, yes, it is possible to set them up to be real-time events if necessary. Okay, great, thank you for that answer. And another question is: what kind of training do you currently have available? I can take this one. First of all, we have Celigo University, which we launched earlier this year, with lots of courses for all the pre-built integrations and for our platform, training our customers on how to set up and configure their integrations. These are very detailed. On top of that, we also have our knowledge base, with a collection of, again, very detailed articles describing how everything works, what steps to follow to set something up, and how to troubleshoot. So I hope that answers the question. And one last question I have here is: how easy is it for non-technical users to manage and monitor this integration? Kelly, can you talk to that? I can. Once this is installed and configured, it really takes minimal management, so it’s very easy to manage and maintain. You can come in and modify schedules if necessary, or, if you need to change or modify the data that’s coming out of an endpoint, it’s very easy to open up the application, like here with that Salesforce integration.
I can modify the query very quickly, or, if it’s coming out of Shopify, as in this use case, you still have the ability to come in here, modify your export, and define how the data comes out. So management and maintenance of the integrations going forward is very easy. Okay, great. Let me take back the presenter mode here. That was it for today, and I hope you enjoyed this presentation. As I mentioned, we will be sharing a recording of the demonstration with Kelly’s full audio after this. We are interested in hearing about your data warehousing challenges, and also about your other business process automation challenges. So please visit our website at Celigo.com and reach out to us through the chat; we can schedule a call and discuss how we can help you accelerate your digital transformation journey. Thank you so much, everyone, for attending, and get in touch with us. Bye.
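A note on the cron-based schedule discussed in the Q&A: running a flow "up to every five minutes" means executions land on fixed five-minute boundaries, the way a `*/5` cron entry behaves. The sketch below is a hypothetical illustration of that schedule semantics in Python; it is not Celigo's scheduler implementation, and the function name is made up.

```python
# Illustration of interval-boundary scheduling (like a */5 cron entry):
# given a start time, compute when the next runs would fire. This is a
# semantics sketch, not Celigo's actual scheduler.
from datetime import datetime, timedelta

def next_runs(start, interval_minutes, count):
    """Return the next `count` run times strictly after `start`,
    aligned to `interval_minutes` boundaries within the hour."""
    boundary = (start.minute // interval_minutes + 1) * interval_minutes
    first = (start.replace(second=0, microsecond=0)
             + timedelta(minutes=boundary - start.minute))
    return [first + timedelta(minutes=interval_minutes * i)
            for i in range(count)]

# A flow scheduled every 5 minutes, observed at 09:03, next fires at
# 09:05, then 09:10, then 09:15.
runs = next_runs(datetime(2021, 6, 1, 9, 3), 5, 3)
```

In practice this is why a five-minute cron schedule bounds data freshness at roughly five minutes: a record created just after one boundary waits until the next one to be picked up.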

About The Speaker

Ebru Saglam

Sr. Product Marketing Manager

Ebru has a diverse background with over a decade of combined experience in marketing, technical sales, and customer service roles across startups and enterprises. She also has hands-on experience in the e-commerce landscape, having spent more than 5 years running her own DTC multi-channel e-commerce business.

Kelly Izer

Sr. Solutions Consultant

Kelly joined Celigo over 8 years ago. He has over 20 years of experience in the integration industry, with a focus on helping companies capture the benefits of SaaS platforms with iPaaS.

Meet Celigo

Celigo automates your quote-to-cash process with an easy & reusable integration platform-as-a-service (iPaaS), trusted by thousands of eCommerce and SaaS companies worldwide.

Use it now and in the future to expedite integration work without adding more data silos, specialized technical skillsets, or one-off projects.