
Customer Data Management platform


During our recent exploration of the customer data management platform Segment by Twilio, we were astonished by the data federation methods it offers. What Zapier and Auth0 did for connectors and the authorization layer, Segment is trying to accomplish for the customer data flow pipeline.

One especially useful feature of the platform is Functions. After testing the platform, we wanted to help more users understand it by illustrating how we managed our data flow before Segment, on Microsoft Azure Cloud with Azure Functions and Snowflake.

An ETL pipeline between Azure SQL and a Snowflake data warehouse was never a straightforward task. It typically meant the following (a rough code sketch follows the list below):

1. Setting up a timer-triggered Azure Function
2. Grouping functions based on the data that needs to be processed
3. Creating function orchestrations
4. Building a Durable Functions framework for long-running processes
5. Setting up activity triggers
6. Reviewing the loads with the Snowflake visualizer
7. Taking care of dependency injection for the Azure Functions
8. Spending hours monitoring the functions, usually until the in-browser runtime dies on you
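
To give a flavour of what that wiring looked like, here is a minimal sketch of steps 1, 3, 4 and 5 using the Python programming model for Azure Durable Functions. It is illustrative only: the cron schedule, table, column and environment-variable names below are hypothetical stand-ins, not our actual pipeline.

```python
# Minimal sketch (not production code): a timer-triggered Durable Functions
# pipeline that copies rows from Azure SQL into Snowflake. All names,
# schedules and environment variables are hypothetical placeholders.
import os

import azure.functions as func
import azure.durable_functions as df
import pyodbc
import snowflake.connector

app = df.DFApp(http_auth_level=func.AuthLevel.FUNCTION)


# Step 1: timer trigger that starts the orchestration every night at 01:00 UTC.
@app.timer_trigger(arg_name="timer", schedule="0 0 1 * * *")
@app.durable_client_input(client_name="client")
async def nightly_etl_timer(timer: func.TimerRequest,
                            client: df.DurableOrchestrationClient):
    await client.start_new("etl_orchestrator")


# Steps 3-4: the orchestrator is the durable, long-running coordinator;
# it only sequences activities and never does I/O itself.
@app.orchestration_trigger(context_name="context")
def etl_orchestrator(context: df.DurableOrchestrationContext):
    rows = yield context.call_activity("extract_from_azure_sql", "dbo.Customers")
    loaded = yield context.call_activity("load_into_snowflake", rows)
    return loaded


# Step 5: activity trigger that pulls rows out of Azure SQL.
@app.activity_trigger(input_name="table")
def extract_from_azure_sql(table: str) -> list:
    conn = pyodbc.connect(os.environ["AZURE_SQL_CONNECTION_STRING"])
    with conn.cursor() as cur:
        # Table name is a fixed, trusted constant passed by the orchestrator.
        cur.execute(f"SELECT Id, Email, UpdatedAt FROM {table}")
        # Stringify values so the rows survive the JSON hop between activities.
        return [[str(col) for col in row] for row in cur.fetchall()]


# Step 5: activity trigger that bulk-inserts the rows into Snowflake.
@app.activity_trigger(input_name="rows")
def load_into_snowflake(rows: list) -> int:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="CUSTOMER_DB",
        schema="PUBLIC",
    )
    try:
        conn.cursor().executemany(
            "INSERT INTO CUSTOMERS (ID, EMAIL, UPDATED_AT) VALUES (%s, %s, %s)",
            rows,
        )
        return len(rows)
    finally:
        conn.close()
```

Even in this stripped-down form the moving parts from the list are visible: the orchestrator only coordinates, the real I/O has to live in separate activity functions so replays stay deterministic, and every one of these pieces still needs its own dependency wiring and monitoring.
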
Segment's plug-and-play architecture does away with all of these hassles, and it is here to stay.

The difference between the conventional data flow and the Segment data flow is shown below.
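
For contrast, the Segment side of the comparison needs little more than a call to one of Segment's source libraries; routing to destinations and warehouse syncs is configured in the Segment UI rather than in code. A minimal sketch with the analytics-python library is below (the write key, user id and event fields are placeholders):

```python
# Minimal sketch of the Segment "plug and play" flow using the
# analytics-python source library. Write key, user id and traits are
# placeholders; destinations and warehouse syncs are configured in Segment.
import analytics

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"

# Identify the customer once...
analytics.identify("user_123", {
    "email": "jane@example.com",
    "plan": "enterprise",
})

# ...then track events; Segment routes them to every connected destination.
analytics.track("user_123", "Invoice Paid", {
    "amount": 249.00,
    "currency": "USD",
})

# Flush the background queue before the process exits.
analytics.flush()
```
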

Segment source: https://segment.com/. Conventional flow comparison: Theecode.