
AWS Cost Reduction

This project optimized the cloud infrastructure of a U.S. IT services company to reduce costs and improve performance. Our investigation identified several optimization targets, including unused computing resources, inconsistent storage, and a lack of savings plans. We right-sized resources, implemented better storage policies, and improved internal traffic flow through an architecture redesign and dockerization.

$23k+

monthly savings

8%

performance optimization

About the client

A U.S. IT services and consulting company that protects businesses and their customers from unauthorized account access and identity fraud.

Tech stack

AWS
AWS EC2
S3
ECS
Docker

The client's needs

During a period of explosive growth, the AWS cloud infrastructure cost reached $57k per month, a 265% increase over the preceding nine months. The key goal was full cost optimization.

Challenges & solutions

Challenge

To identify areas where costs could be reduced.

Solution

During discovery, the team investigated the company's cloud infrastructure, including computing resources, storage, and internal traffic flow, to find potential cost reduction targets.

Challenge

Address unused computing resources:

  • 79% of instances were underused
  • 12% of instances were marked as “forgotten”
  • “Master” instances kept running after CI/CD jobs finished
  • No limits on instance creation for company employees

Solution

Computing resources:

  • Optimized instance capacity
  • Cleaned up forgotten instances and resources
  • Scheduled non-production resources to stop during off-hours
  • Optimized CI/CD to stop “Master” instances in load balancer groups
  • Performed a security audit and configured valid permissions for users and groups
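Off-hours scheduling of this kind can be sketched as a small decision helper plus a boto3 call, run from a cron job or scheduled Lambda. The tag key, environment names, and business hours below are illustrative assumptions, not the client's actual setup.

```python
from datetime import time

# Hypothetical business hours; adjust to the team's actual schedule.
BUSINESS_START = time(8, 0)
BUSINESS_END = time(20, 0)

def should_stop(env_tag, now):
    """Return True if a non-production instance should be stopped now."""
    if env_tag == "production":
        return False  # never touch production
    return not (BUSINESS_START <= now < BUSINESS_END)

def stop_off_hours_instances(ec2, now):
    """Stop running dev/staging instances outside business hours.

    `ec2` is a boto3 EC2 client; tag keys and values are illustrative.
    """
    if not should_stop("dev", now):
        return []
    resp = ec2.describe_instances(Filters=[
        {"Name": "instance-state-name", "Values": ["running"]},
        {"Name": "tag:Environment", "Values": ["dev", "staging"]},
    ])
    ids = [inst["InstanceId"]
           for res in resp["Reservations"]
           for inst in res["Instances"]]
    if ids:
        ec2.stop_instances(InstanceIds=ids)
    return ids
```

A matching start job in the morning completes the schedule; the savings scale with the fraction of the week your non-production fleet sits idle.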

Challenge

Volumes/Snapshots/AMI:

  • 31% of volumes had usage below 10%
  • 136 TB of volumes were not attached
  • No consistency in storage types

Solution

Volumes/Snapshots/AMI:

  • Resized underused storage
  • Backed up data to S3 and removed unused volumes
  • Optimized storage types and downgraded provisioned IOPS
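A first pass at finding detachable storage can be sketched from a `describe_volumes`-style response. The field names match the EC2 API shape; the sample data is invented for illustration.

```python
def unattached_volume_report(volumes):
    """Summarize EBS volumes in the 'available' state (i.e. not attached),
    given the 'Volumes' list from an EC2 describe_volumes response."""
    avail = [v for v in volumes if v["State"] == "available"]
    return {
        "count": len(avail),
        "total_gib": sum(v["Size"] for v in avail),  # Size is in GiB
        "ids": [v["VolumeId"] for v in avail],
    }

# Example with inline data shaped like the API response:
sample = [
    {"VolumeId": "vol-aaa", "State": "available", "Size": 100},
    {"VolumeId": "vol-bbb", "State": "in-use", "Size": 50},
]
```

With boto3 you would feed it `ec2.describe_volumes()["Volumes"]`, snapshot the reported IDs to S3, and then delete the volumes.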

Challenge

S3:

  • No retention policy; data was stored forever
  • Unused data, test artifacts, and duplicates
  • No deep storage for old data (Glacier)

Solution

S3:

  • Created a retention policy
  • Cleaned up unused data and files
  • Moved old data to Glacier storage
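A retention policy of this kind is typically expressed as an S3 lifecycle configuration. The sketch below builds one; the 90-day and 365-day thresholds are illustrative assumptions, not the client's actual policy.

```python
def lifecycle_config(days_to_glacier=90, days_to_expire=365):
    """Build an S3 lifecycle configuration that moves objects to Glacier
    after `days_to_glacier` days and deletes them after `days_to_expire`."""
    return {
        "Rules": [{
            "ID": "archive-then-expire",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to the whole bucket
            "Transitions": [
                {"Days": days_to_glacier, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": days_to_expire},
        }]
    }
```

With boto3 this would be applied via `s3.put_bucket_lifecycle_configuration(Bucket=bucket, LifecycleConfiguration=lifecycle_config())`; after that, S3 enforces the retention policy automatically, with no cleanup scripts to maintain.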

Challenge

Savings Plans:

  • No unified instance families
  • Use of outdated instance families
  • No Savings Plans enabled

Solution

Savings Plans:

  • Upgraded all instances to the current generation
  • Chose instance types appropriate for the load
  • Added reserved instances for stable parts of the infrastructure
  • Added Savings Plans for the remaining instances
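Whether a commitment pays off comes down to simple break-even arithmetic: a reserved instance or Savings Plan wins whenever actual utilization exceeds the ratio of the committed rate to the on-demand rate. The rates in the example are hypothetical, not actual AWS prices.

```python
def breakeven_utilization(on_demand_hourly, committed_hourly):
    """Fraction of hours an instance must actually run for a committed
    rate (reserved instance / Savings Plan) to beat pure on-demand."""
    return committed_hourly / on_demand_hourly

def monthly_saving(on_demand_hourly, committed_hourly, hours=730):
    """Saving per always-on instance per month at the committed rate
    (730 = average hours in a month)."""
    return (on_demand_hourly - committed_hourly) * hours
```

For example, with a hypothetical $0.096/h on-demand rate against a $0.060/h committed rate, break-even is 62.5% utilization, so every always-on instance clearly qualifies for a commitment.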

Challenge

Internal traffic (architecture redesign) accounted for 7.5% of costs:

  • Internal services were set up as separate EC2 instances
  • Most data was transferred through a chain of at least 5

Solution

Dockerization:

  • Dockerized the microservices
  • Ran chains of services on the same instance, which fully eliminated inter-instance data transfer and increased final performance by 8%
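Co-locating a chain of services can be sketched as a compose file that pins them to one host's local bridge network, so inter-service traffic never crosses instance boundaries. Service names and images below are hypothetical.

```yaml
# Illustrative sketch: three chained services on one instance.
# Container-to-container traffic on a local bridge network stays
# on-host, so no inter-instance data transfer charges apply.
services:
  ingest:
    image: example/ingest:latest
    networks: [chain]
  transform:
    image: example/transform:latest
    depends_on: [ingest]
    networks: [chain]
  deliver:
    image: example/deliver:latest
    depends_on: [transform]
    networks: [chain]

networks:
  chain:
    driver: bridge
```

The same containers can also be packed onto shared hosts with an ECS task definition; the cost effect is identical as long as each chain stays on one instance.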

Results

The results were impressive: $23.5k in monthly savings, a 67.5% reduction in instance costs, 916 TB of unused storage removed, and an 8% speed improvement. Additionally, the company now operates a more secure and robust infrastructure that can handle a doubling of clients with less than 16% cost growth.


The team's deep understanding of our needs allowed us to achieve a more secure, robust, and faster infrastructure that can handle growth without incurring exorbitant costs.


Harris N.

CTO, IT Services & Consulting

Steps of providing data scraping services


Step 1 of 5

Free consultation

This is a good time to get to know each other, share values, and discuss your project in detail. We will advise you on a solution and help you understand whether we are a perfect match for you.

Step 2 of 5

Discovering and feasibility analysis

One of our core values is flexibility, so we work with either one-page high-level requirements or a full pack of tech docs. At this stage, we need to ensure that we understand the full scope of the project. We receive documentation from you or perform a set of interviews, and prepare the following documents: a list of features with detailed descriptions and acceptance criteria; a list of fields that need to be scraped; and the solution architecture. Ultimately, we make a project plan which we strictly follow. We are a result-oriented company, and that is one of our core values as well.

Step 3 of 5

Solution development

At this stage, we develop the scraping engine's core logic. We run multiple tests to ensure that the solution works properly. We map the fields and run the scraping. While scraping, we keep the full raw data so the final model can be enlarged easily. Ultimately, we store the data in a database of your choice and run quality assurance tests.

Step 4 of 5

Data delivery

After quality assurance tests are completed, we deliver the data and solutions to the client. Though we have over 15 years of expertise in data engineering, we expect the client's participation in the project. While developing and crawling data, we provide midterm results so you can always see where we are and give us feedback. By the way, a high level of communication is also one of our core values.

Step 5 of 5

Support and continuous improvement

We understand how crucial the solutions that we build for our clients are! Our goal is to build long-term relationships, so we provide guarantees and support agreements. What is more, we are always happy to assist with further development; our statistics show that 97% of our clients return to us with new projects.

How we provide data integration solutions


Step 1 of 5

Free consultation

This is a good time to get to know each other, share values, and discuss your project in detail. We will advise you on a solution and help you understand whether we are a perfect match for you.

Step 2 of 5

Discovering and feasibility analysis

One of our core values is flexibility, so we work with either one-page high-level requirements or a full pack of tech docs.

At this stage, we need to ensure that we understand the full scope of the project. We receive documentation from you or perform a set of interviews, and prepare the following documents: the integration pipeline (which data we should get and where to upload it); the process logic (how the system should work); use cases and acceptance criteria; and the solution architecture. Ultimately, we make a project plan which we strictly follow.

Step 3 of 5

Solution development

At this stage, we build ETL pipelines and the necessary APIs to automate the process. We involve our DevOps team to build the most efficient and scalable solution, and finish with unit tests and quality assurance tests to ensure that the solution works properly. Focus on results is one of our core values as well.

Step 4 of 5

Solution delivery

After quality assurance tests are completed, we deliver the solution to the client. Though we have over 15 years of expertise in data engineering, we expect the client's participation in the project. While developing the integration system, we provide midterm results so you can always see where we are and give us feedback. By the way, a high level of communication is also one of our core values.

Step 5 of 5

Support and continuous improvement

We understand how crucial the solutions that we build for our clients are! Our goal is to build long-term relationships, so we provide guarantees and support agreements. What is more, we are always happy to assist with further development; our statistics show that 97% of our clients return to us with new projects.

Steps of providing web applications services


Step 1 of 7

Web development discovery

In the initial stage of a web development project, professional business analysts prepare detailed documentation of the project requirements and the approximate structure of the future web application. DATAFOREST is a custom web application development agency guided by extensive experience in multiple industries. We give you detailed project documentation and then assemble the team according to your timeline and budget.

Step 2 of 7

UX and UI design

Based on your wishes, the needs of your target audience, and the best web application design and development practices, our UX and UI experts create an aesthetically pleasing and user-friendly interface for your app to satisfy even the most demanding users.

Step 3 of 7

Web-based application development

At DATAFOREST, we follow the best programming design principles and approaches. As a data engineering company, we build high-load platforms with a significant level of flexibility and result orientation. We keep our deadlines and follow SOC 2 compliance requirements.

Step 4 of 7

Integration

With DATAFOREST, integrating the application into your business won’t stop your processes for a minute. We provide seamless integration with your software infrastructure and ensure smooth operation in no time.

Step 5 of 7

Quality assurance

We use a multi-level quality assurance system to avoid unforeseen issues. Working with DATAFOREST, you can be confident that your web app solutions arrive on the market polished and in full compliance with all certification requirements.

Step 6 of 7

24/7 support

Once a product is released to the market, it’s crucial to keep it running smoothly. That’s why our experts provide several models of post-release support to ensure application uptime and stable workflows, increasing user satisfaction.

Step 7 of 7

Web app continuous improvement

Every truly high-quality software product has to constantly evolve to keep up with the times. We understand this, and therefore we provide services for updating and refining our software, as well as introducing new features to meet the growing needs of your business and target audience.

The way we deal with your task and help achieve results


Step 1 of 5

Free consultation

This is a good time to get to know each other, share values, and discuss your project in detail. We will advise you on a solution and help you understand whether we are a perfect match for you.

Step 2 of 5

Discovering and feasibility analysis

One of our core values is flexibility, so we work with either one-page high-level requirements or a full pack of tech docs.

In Data Science, there are numerous models and approaches, so at this stage we perform a set of interviews in order to define project objectives. We elaborate and discuss a set of hypotheses and assumptions. We create solution architecture, a project plan, and a list of insights or features that we have to achieve.

Step 3 of 5

Solution development

The work starts with data gathering, cleaning, and analysis. Feature engineering helps determine your target variable and build several models for an initial review. Further modeling requires validating results and selecting models for further development. Ultimately, we interpret the results. Nevertheless, data modeling is a process that requires many back-and-forth iterations. We are result-focused, as it's one of our core values as well.

Step 4 of 5

Solution delivery

Data Science solutions can be a list of insights or a variety of models that consume data and return results. Though we have over 15 years of expertise in data engineering, we expect the client's participation in the project. While modeling, we provide midterm results so you can always see where we are and give us feedback. By the way, a high level of communication is also one of our core values.

Step 5 of 5

Support and continuous improvement

We understand how crucial the solutions that we build for our clients are! Our goal is to build long-term relationships, so we provide guarantees and support agreements. What is more, we are always happy to assist with further development; our statistics show that 97% of our clients return to us with new projects.

The way we deal with your issue and achieve results

Step 1 of 5

Free consultation

This is a good time to get to know each other, share values, and discuss your project in detail. We will advise you on a solution and help you understand whether we are a perfect match for you.

Step 2 of 5

Discovering and feasibility analysis

One of our core values is flexibility, so we work with either one-page high-level requirements or a full pack of tech docs.

Depending on project objectives, DevOps activity requires auditing the current approach, measuring metrics, monitoring, and checking logs. Through a set of interviews, we ensure that we understand the full scope of the project. Ultimately, we make a project plan which we strictly follow. We are a result-oriented DevOps service provider, and that is one of our core values as well.

Step 3 of 5

Solution development

At this stage, our certified DevOps engineers refine the product backlog. We deliver great results in digital transformation, cost optimization, CI/CD setup, containerization, and, last but not least, monitoring and logging. We are a result-focused company; it's one of our core values.

Step 4 of 5

Solution delivery

After quality assurance tests are completed, we deliver the solution to the client. Though we have over 15 years of expertise in data engineering, we expect the client's participation in the project. By the way, a high level of communication is also one of our core values.

Step 5 of 5

Support and continuous improvement

We understand how crucial the solutions that we build for our clients are! Our goal is to build long-term relationships, so we provide guarantees and support agreements. What is more, we are always happy to assist with further development; our statistics show that 97% of our clients return to us with new projects.

Success stories

Check out a few case studies that show why DATAFOREST will meet your business needs.

Infrastructure Audit & Intelligent Notifications

An e-commerce company had issues with managing its complex IT infrastructure across multiple cloud providers. We helped to analyze the current architecture and develop a strategy for unification, scaling, monitoring, and notifications. As a result, we implemented a single cloud provider, CI/CD process, server unification, security and vulnerability mitigation actions, and improved reaction speed and reliability by 200%.
200%

performance boost

24/7

monitoring


Dean Schapiro

Co-Founder, CTO Ecom Innovators, E-commerce company
View case study

Not only are they experts in their domains, but they also provide perfect outcomes.

Performance Optimization & Bottlenecks Elimination

The financial services company faced performance issues and bottlenecks on its financial platform and was looking for a DevOps partner to increase application performance, stability, and resilience, and reduce operational costs. DATAFOREST performed a technical audit of the current AWS infrastructure, created a bottleneck monitoring system, re-developed inefficient SQL queries and data pipelines, and implemented horizontal scaling and a microservice approach using Docker and Kubernetes.
1000%

performance boost

20%

cost optimization


Daniel Garner

CTO Flexium, FinTech company
View case study

The team of DATAFOREST is very skilled and equipped with high knowledge.

DevOps Experience

The ML startup faced high infrastructure costs during its growth: its data-driven platform processes around 30 TB per month and stores raw data for 12 months on AWS. We reduced the monthly cost from $75,000 to $22,000 and achieved performance 30% above the SLA.
2k+

QPS performance

70%

cost reduction


Robert P.

CTO Cybersecurity
View case study

They have very intelligent people on their team — people that I would gladly hire and pay for myself.

Financial Intermediation Platform

The project aimed to develop a deal origination platform for private equity investments in infrastructure-related sectors. It involved building a secure, interactive B2B platform from scratch, with sign-up functionality to connect investment firms to proprietary investment opportunities. DATAFOREST built a highly loaded platform and applied AI functionality to empower the application's development.
98%

model accuracy

100%

timely development


Enrico Cattabiani

Founder & CEO IDN, Infrastructure Deals Network
View case study

They understood our requirements, translated into actions rapidly, and adapted to requests easily.

Activate Your Data’s Power
Get in touch today to discover how we can help you use your data to surpass the competition.
Book a call


We’d love to hear from you

Share the project details – like scope, mockups, or business challenges. We will carefully check and get back to you with the next steps.
