P&C Insurance Analytics
An analytics platform that helps P&C insurance giants optimize their insurance strategies.

Problem Statement
The client is a US-based company that builds platforms for property & casualty insurance companies, brokers, agents, and construction companies that aim to make their insurance and administration processes paperless and efficient.
With programs and projects worth over 100 billion USD being executed, administrators at times lose visibility of process blockages because of the sheer amount of data generated. Identifying on-site hazards, fraudulent insurance claims, and efficient insurance packages (Workers' Compensation & General Liability) becomes cumbersome because of the complex calculations involved. With projects executed at such gigantic scale, even a fractional reduction in brokerage leads to million-dollar savings.
Challenges
In conversations with the business partners and IT teams, and through an audit of the systems, we found the following:
- Many clients had been migrated from other vendors, and the migration process left a lot of buggy data in the system that required thoughtful cleaning.
- The source database had numerous issues, which were identified in the first phase of implementation.
- Owing to the nature of the domain, the expected accuracy and precision were very high, even in the presence of bogus data.
- Every type of data was being stored in SQL Server, making it considerably slow. Even dynamic, semi-structured data in JSON format was stored in SQL Server.
- The client was not very aware of analytics until their customers started asking for it, so it was important to convey the ROI on investments in analytics and data engineering.
- The marketing and sales teams were very aggressive and generated plenty of leads and demos, but the data layer in the products was poorly designed, which called the scalability of the system into question.
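The data cleaning mentioned above was largely rule-based validation of migrated records. A minimal sketch of the idea, assuming hypothetical field names (`policy_id`, `premium`, and the date fields below are illustrative, not the client's actual schema):

```python
from datetime import date

# Required fields for a migrated policy record; names are illustrative
# assumptions, not the client's actual schema.
REQUIRED_FIELDS = {"policy_id", "effective_date", "expiry_date", "premium"}

def validate_policy(record: dict) -> list:
    """Return a list of data-quality issues found in one migrated record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
        return issues  # remaining checks need these fields present
    if record["premium"] is None or record["premium"] <= 0:
        issues.append("non-positive or null premium")
    if record["effective_date"] >= record["expiry_date"]:
        issues.append("effective_date is not before expiry_date")
    return issues

# Example: a record carrying a typical migration artifact (zeroed premium).
bad = {
    "policy_id": "WC-1001",
    "effective_date": date(2022, 1, 1),
    "expiry_date": date(2023, 1, 1),
    "premium": 0,
}
print(validate_policy(bad))  # -> ['non-positive or null premium']
```

Flagged records can then be routed to a quarantine table for review instead of silently flowing into reports.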
Solution Design & Implementation

We implemented this project end-to-end, including optimizing source databases, implementing data engineering pipelines, designing data warehouses, and building enterprise analytics with Power BI. The salient features of the architecture we designed and implemented for them were:
- Data Lake Architecture: Senior leadership expected data to grow at a 3x pace over the next year, so high data volumes and semi-structured data needed to be stored and analyzed alongside the relational database.
- ELT with Snowpark: We implemented Extract, Load & Transform (ELT) on raw data by loading it into Snowflake on top of the data lake created in AWS S3, using Snowpark for transformations to keep the data inside the Snowflake ecosystem.
- Cloud-Based Architecture, Easy Access, Cost Savings: Owing to the data scale and variety, we essentially implemented a big data architecture on the cloud, allowing business users to access the data easily while saving costs during non-operational hours and quiet business periods.
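The transform step of the ELT pipeline above mostly flattened semi-structured records into tabular rows. Since Snowpark itself needs a live Snowflake session, here is a connection-free sketch of that transform in plain Python; the payload shape and field names are illustrative assumptions, not the client's schema:

```python
import json

# Illustrative raw claim payload as it might land in the S3 data lake.
raw = json.dumps({
    "claim_id": "CL-507",
    "line": "Workers Compensation",
    "amounts": {"claimed": 12500.0, "approved": 9800.0},
})

def flatten_claim(payload: str) -> dict:
    """Flatten one semi-structured claim record into a tabular row,
    mirroring the kind of transform run inside Snowflake via Snowpark."""
    doc = json.loads(payload)
    return {
        "claim_id": doc["claim_id"],
        "line": doc["line"],
        "claimed": doc["amounts"]["claimed"],
        "approved": doc["amounts"]["approved"],
    }

row = flatten_claim(raw)
print(row["approved"])  # -> 9800.0
```

In the actual pipeline, the same logic runs over Snowflake VARIANT columns so the data never leaves the warehouse.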
Results
- Reduced contract closeout time
- Paperwork eliminated
- Reduction in insurance frauds
- Increased sales leads
Future Plans
Data Lake Expansion
Load and process data from other applications into the central data lake to provide 360-degree analytics.
Central Analytics
Convert all static reports running on OLTP systems into analytical dashboards running on the enterprise data warehouse.
Analytics on SaaS
Sell analytics as a product to clients, since the dashboards uncovered business problems the clients were unaware of.
Data Governance
Implement data governance policies and data standardization practices for compliance, security, and auditing purposes.