Power BI Dashboard Developers
The client wanted a complex Power BI dashboard with many different types of Power BI visualizations, and the input data was quite raw. Our Power BI development team built the required custom controls and delivered the dashboard.
The data in the source files did not follow a fixed schema, so the AWS Glue crawlers needed custom classifiers, which our AWS Glue developer team built using complex Grok regex patterns.
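A minimal boto3 sketch of this pattern, assuming a log-style source format; the classifier name, Grok pattern, bucket, and IAM role below are illustrative placeholders, not the client's actual values:

```python
import boto3

glue = boto3.client("glue")

# Register a custom Grok classifier so a crawler can parse source files
# that have no fixed schema (names and pattern are hypothetical).
glue.create_classifier(
    GrokClassifier={
        "Classification": "custom-app-log",
        "Name": "app-log-classifier",
        "GrokPattern": "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:message}",
    }
)

# A crawler referencing the classifier (crawler, role, and path are placeholders).
glue.create_crawler(
    Name="raw-files-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="raw_data",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/raw/"}]},
    Classifiers=["app-log-classifier"],
)
```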
Our team designed the Amazon Redshift database cluster. The distribution key and sort key were chosen based on the client's end use case, and the cluster was laid out so that all of the nodes were used optimally.
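A sketch of what such DDL can look like, with a hypothetical table and key choices rather than the client's actual schema; the distribution key targets the common join column so rows spread evenly across nodes, and the sort key targets the range-scanned timestamp:

```python
import psycopg2

# Hypothetical table: DISTKEY on the join column, SORTKEY on the timestamp.
ddl = """
CREATE TABLE sales_events (
    customer_id BIGINT,
    event_time  TIMESTAMP,
    amount      DECIMAL(10, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)
SORTKEY (event_time);
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="admin", password="..."
)
with conn, conn.cursor() as cur:
    cur.execute(ddl)
```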
Our developer created an AWS Lambda function in Node.js that used the headless PhantomJS browser to scrape live data from a website. The scraped data was written to an AWS RDS MySQL instance, and the function was triggered on a CloudWatch schedule.
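The function itself was written in Node.js with PhantomJS; purely as an illustration of the same scrape-and-store flow, here is a minimal Python sketch with a placeholder URL, selector, and credentials:

```python
import pymysql
import requests
from bs4 import BeautifulSoup

# Python analogue of the Node.js/PhantomJS scraper, for illustration only.
def handler(event, context):
    html = requests.get("https://example.com/live-data", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    rows = [(el.get_text(strip=True),) for el in soup.select(".price")]

    conn = pymysql.connect(host="example.rds.amazonaws.com",
                           user="scraper", password="...", database="scrapes")
    with conn:
        with conn.cursor() as cur:
            cur.executemany("INSERT INTO prices (value) VALUES (%s)", rows)
        conn.commit()
```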
Our Tableau developers created useful dynamic visualizations from raw CSV data provided by the customer. The Tableau visualizations helped the customer analyze the data in a much more informed manner.
Our PySpark developers created complex transformations and used AWS Glue to move millions of rows of CSV data, including complex JSON nested within the CSV, into an Amazon Redshift data warehouse.
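A minimal PySpark sketch of the approach, with hypothetical column names and a placeholder Redshift endpoint; the JSON blob embedded in one CSV column is parsed with an explicit schema and flattened before the JDBC write:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("csv-to-redshift").getOrCreate()

# Hypothetical schema for the JSON blob embedded in one CSV column.
payload_schema = StructType([
    StructField("sku", StringType()),
    StructField("price", DoubleType()),
])

df = spark.read.option("header", True).csv("s3://example-bucket/input/*.csv")

# Parse the nested JSON column and flatten it into top-level columns.
flat = (df
        .withColumn("payload", from_json(col("payload_json"), payload_schema))
        .select("order_id", "payload.sku", "payload.price"))

# Write to Redshift over JDBC (endpoint and credentials are placeholders).
(flat.write.format("jdbc")
     .option("url", "jdbc:redshift://example-cluster:5439/analytics")
     .option("dbtable", "orders_flat")
     .option("user", "admin").option("password", "...")
     .mode("append").save())
```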
Our client wanted to import a very large CSV/JSON dataset into Snowflake. We took on the project, administered Snowflake, and created all the necessary ETL scripts.
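A minimal sketch of the load step using the Snowflake Python connector, assuming the files are already staged; the stage, table, and credentials are placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Bulk-load staged CSV files; Snowflake parallelizes the COPY internally.
cur.execute("""
    COPY INTO raw_events
    FROM @etl_stage/events/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
cur.close()
conn.close()
```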
Our client wanted to connect their farm of IoT devices to AWS IoT Core. Our IoT developer integrated the massive number of devices with AWS IoT Core and wrote the incoming data to Amazon DynamoDB.
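One way to wire this up is an AWS IoT topic rule that forwards every telemetry message into DynamoDB; a boto3 sketch with placeholder rule, topic, role, and table names:

```python
import boto3

iot = boto3.client("iot")

# Route every message on the device telemetry topic into a DynamoDB table
# (rule name, topic, role, and table are hypothetical placeholders).
iot.create_topic_rule(
    ruleName="telemetry_to_dynamodb",
    topicRulePayload={
        "sql": "SELECT * FROM 'devices/+/telemetry'",
        "actions": [{
            "dynamoDBv2": {
                "roleArn": "arn:aws:iam::123456789012:role/IotDynamoRole",
                "putItem": {"tableName": "device_telemetry"},
            }
        }],
    },
)
```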
A small subset of data needed to be transferred from AWS S3 to PostgreSQL on a daily basis. Our AWS Lambda Python developer created the Lambda scripts, triggered on a CloudWatch schedule, and administered the PostgreSQL database.
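A minimal sketch of such a Lambda handler, assuming a small CSV object; the bucket, key, table, and credentials are placeholders:

```python
import csv
import io

import boto3
import psycopg2

s3 = boto3.client("s3")

# Daily sync: read one small CSV object from S3 and insert its rows.
def handler(event, context):
    obj = s3.get_object(Bucket="example-bucket", Key="daily/subset.csv")
    reader = csv.reader(io.StringIO(obj["Body"].read().decode("utf-8")))
    next(reader)  # skip header row

    conn = psycopg2.connect(host="example.rds.amazonaws.com",
                            dbname="reports", user="etl", password="...")
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO daily_subset (id, value) VALUES (%s, %s)",
            [(row[0], row[1]) for row in reader],
        )
```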
Our Kubernetes engineer helped convert a legacy application into a multi-tenant SaaS application and then deployed it with Kubernetes. The containers were deployed on Google Kubernetes Engine (GKE).
The client wanted to transfer data from Google Cloud Storage to Google BigQuery, transforming it in the process. Our Google Cloud data engineers completed the ETL process with ease using Google Dataflow.
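A minimal Apache Beam sketch of the kind of pipeline Dataflow runs, with a hypothetical CSV layout, project, bucket, and destination table:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical row parser for a two-column CSV.
def to_row(line):
    name, amount = line.split(",")
    return {"name": name, "amount": float(amount)}

options = PipelineOptions(runner="DataflowRunner", project="example-project",
                          region="us-central1",
                          temp_location="gs://example-bucket/tmp")

# Read from GCS, transform, and write to BigQuery.
with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv",
                                      skip_header_lines=1)
     | "Transform" >> beam.Map(to_row)
     | "Write" >> beam.io.WriteToBigQuery(
           "example-project:analytics.sales",
           schema="name:STRING,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```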
Our Python Lambda developers created a very high-performance REST API using AWS API Gateway, with Lambda endpoints that get and put data in an AWS RDS MySQL database.
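A minimal sketch of one such Lambda endpoint behind API Gateway (proxy integration assumed); the host, table, and payload shape are placeholders:

```python
import json

import pymysql

# GET lists items, PUT inserts one; event shape follows Lambda proxy integration.
def handler(event, context):
    conn = pymysql.connect(host="example.rds.amazonaws.com", user="api",
                           password="...", database="appdb",
                           cursorclass=pymysql.cursors.DictCursor)
    with conn:
        with conn.cursor() as cur:
            if event.get("httpMethod") == "PUT":
                item = json.loads(event["body"])
                cur.execute("INSERT INTO items (name) VALUES (%s)",
                            (item["name"],))
                conn.commit()
                return {"statusCode": 201, "body": json.dumps({"ok": True})}
            cur.execute("SELECT id, name FROM items")
            return {"statusCode": 200, "body": json.dumps(cur.fetchall())}
```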
The client needed a custom Alexa skill. Our Alexa developers created the skill for controlling specified devices, with a Lambda endpoint that retrieved data from a third-party API.
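A minimal raw-JSON sketch of such a Lambda endpoint; the intent name and third-party URL are hypothetical placeholders:

```python
import requests

# Handle one hypothetical intent by querying a third-party API, then
# return speech in the Alexa response format.
def handler(event, context):
    intent = event["request"].get("intent", {}).get("name")
    if intent == "DeviceStatusIntent":
        status = requests.get("https://api.example.com/device/42/status",
                              timeout=5).json()
        speech = f"The device is currently {status['state']}."
    else:
        speech = "Sorry, I did not understand that."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```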
Our customer wanted their video content and assets (both static and dynamic) to be available with minimum latency throughout the world. Our CloudFront developers configured and set up an AWS CloudFront CDN with SSL.
We created a complex AWS CloudFormation YAML template for our client covering the creation of a VPC, private and public subnets, route tables, a NAT gateway, an internet gateway, network load balancers, and application load balancers.
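A tiny excerpt of the idea, launched from Python with boto3; this toy template only creates a VPC and one public subnet, not the full resource set described above:

```python
import boto3

# Toy CloudFormation template: a VPC plus one public subnet.
template = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  Vpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
  PublicSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref Vpc
      CidrBlock: 10.0.1.0/24
      MapPublicIpOnLaunch: true
"""

cfn = boto3.client("cloudformation")
cfn.create_stack(StackName="network-demo", TemplateBody=template)
```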
Our Pivotal Cloud Foundry developers created the necessary AWS resources, namely the required instances, NAT gateway, load balancers, etc., and helped maintain and troubleshoot the installation, which ran Spring Boot applications on PCF.
The client wanted automatic processing of camera video streamed through Amazon Kinesis Video Streams, with suitable actions taken on the results. Our developers accessed the live stream data using the Amazon Kinesis Video Streams API.
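A minimal boto3 sketch of that access pattern, with a placeholder stream name: first resolve the GET_MEDIA endpoint, then pull the live fragment stream:

```python
import boto3

# Resolve the data endpoint for the stream, then read live media from it.
kv = boto3.client("kinesisvideo")
endpoint = kv.get_data_endpoint(StreamName="camera-feed-1",
                                APIName="GET_MEDIA")["DataEndpoint"]

media = boto3.client("kinesis-video-media", endpoint_url=endpoint)
stream = media.get_media(StreamName="camera-feed-1",
                         StartSelector={"StartSelectorType": "NOW"})

# The payload is a stream of MKV fragments; downstream code would parse
# frames from these chunks and trigger the appropriate actions.
chunk = stream["Payload"].read(1024)
```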
The project was done for a mass-spectrometry-based startup client. Huge on-premise files were processed through a fault-tolerant AWS pipeline built on AWS Fargate by our Docker developer team.
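A minimal boto3 sketch of launching one Fargate processing task per input file; the cluster, task definition, subnet, and environment variable are placeholder assumptions:

```python
import boto3

ecs = boto3.client("ecs")

# Launch one containerized processing job on Fargate for a given file.
ecs.run_task(
    cluster="ms-processing",
    launchType="FARGATE",
    taskDefinition="spectra-processor:1",
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0abc1234"],
            "assignPublicIp": "ENABLED",
        }
    },
    overrides={
        "containerOverrides": [{
            "name": "processor",
            "environment": [{"name": "FILE_KEY",
                             "value": "uploads/run-42.raw"}],
        }]
    },
)
```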
The Python Django framework was made to run serverless. The serverless installation of Django was done by our serverless Django developer team.
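The write-up does not name the exact tooling; one common pattern (an assumption here, not necessarily the one used) is to wrap Django's WSGI application in a Lambda handler behind API Gateway using the apig-wsgi package:

```python
import os

from apig_wsgi import make_lambda_handler
from django.core.wsgi import get_wsgi_application

# Settings module is a placeholder for the actual project.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

# Wrap Django's WSGI app so API Gateway events become WSGI requests.
application = get_wsgi_application()
lambda_handler = make_lambda_handler(application)
```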