Software Development Newsletter: Q2 2025

Director’s Message


Welcome to the Mitrais Newsletter.

In this issue, we explore how strategic partnerships and innovative technologies can drive meaningful transformation across your organisation. Our featured client story with MediRecords demonstrates the power of sustained collaboration over 13 years, showing how AI-driven healthcare solutions can address critical industry challenges like clinician burnout whilst enhancing patient care efficiency. This partnership exemplifies our commitment to long-term value creation alongside our clients.

As data becomes increasingly central to business strategy, our thought leadership piece examines prompt engineering as an emerging competitive advantage. Organisations mastering this skill are already seeing faster insights, reduced operational overhead, and more strategic decision-making across retail, healthcare, finance, and logistics sectors.

Our recent achievement of ISTQB Platinum Partner status reinforces the quality assurance excellence we bring to every engagement. This recognition ensures our clients benefit from certified expertise and structured methodologies that minimise risk, reduce costs, and accelerate time-to-market—enabling confident innovation in today’s competitive landscape.

The technical exploration of modern data pipelines addresses a fundamental challenge many organisations face: efficiently transforming raw data into actionable intelligence at scale. Cloud-native, serverless architectures are now essential infrastructure for achieving the scalability, cost-effectiveness, and competitive agility your business demands.

Finally, our featured employee Anang Prihatanto represents the calibre of talent and dedication behind our success. His journey from curious student to industry leader embodies the continuous learning and innovation mindset we bring to every client partnership.

We look forward to continuing this journey of innovation together and welcome new partnerships that share our vision for technological excellence.

Pioneering the Future of Digital Healthcare: The Mitrais and MediRecords Partnership


In a world where healthcare demands are constantly expanding and ageing populations call for smarter solutions, MediRecords, an Australian medtech leader in cloud-based electronic health records (EHR) and patient management systems, has partnered with Mitrais for over 13 years to enhance software development and drive digital healthcare innovation.

Combining MediRecords’ cutting-edge AI-driven tools, which streamline patient records and alleviate clinician burnout, with Mitrais’ expertise in software development, this collaboration is poised to redefine the landscape of digital healthcare.

In an exclusive interview, Rob Mills of Mitrais spoke with MediRecords CEO Matthew Galetto and CTO Fei Teng about the partnership’s impact on healthcare innovation, its challenges, and its future goals. Their collaboration exemplifies the innovation and teamwork shaping smarter healthcare solutions.

But this isn’t just about technology—it’s about building a brighter, healthier future. Mitrais and MediRecords are not merely addressing the needs of today; they are paving the way for tomorrow, where healthcare efficiency meets technological brilliance.

Click here to watch the video and discover how they revolutionise healthcare for the better!

Why Prompt Engineering Will Become a Core Skill for Data Analysts


A few years ago, data analysts primarily focused on mastering SQL to manage and analyse data. However, the rise of AI-driven tools, particularly large language models (LLMs), is shifting this focus. Today, analysts are increasingly leveraging AI to generate insights from data through natural language prompts.

This shift is driven by several factors, including the rapid advancements in NLP technology and the growing need for faster, more efficient insights.

According to a 2025 report by McKinsey, businesses using AI-driven analysis have seen a 20% increase in speed-to-insight, underlining the importance of integrating AI tools into the analyst’s workflow. As a result, prompt engineering, the skill of crafting effective prompts for AI models, is becoming a key competency for data analysts. In this article, we’ll walk through what prompt engineering really is, why it matters, how data analysts are using it, and what decision-makers can do to stay ahead.

What Is Prompt Engineering?

Prompt engineering sounds like another tech buzzword, but it’s a simple, practical skill. At its core, it’s about writing clear and effective instructions (called “prompts”) that guide an AI model, such as ChatGPT or another LLM, to produce a useful result.

Think of it like giving instructions to a super-smart intern who takes things very literally. The more clearly you explain what you want, including format, tone, constraints, and context, the better the result.

For example, if you ask ChatGPT:

“Summarise this dataset.”

You might get a generic paragraph.

But if your prompt looks like this:

“Summarise this dataset for a finance executive, highlight top 3 anomalies in revenue by region, and recommend actions in bullet points.”

Now the result is specific, useful, and actionable.
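To make this concrete in code, here is a minimal sketch of sending that more specific prompt to an LLM from Python. It assumes the OpenAI Python client, an OPENAI_API_KEY environment variable, and a placeholder model name and dataset file; adapt these to your own provider and data.

  # Minimal sketch: sending a specific, context-rich prompt to an LLM.
  # Assumes the `openai` package and an OPENAI_API_KEY environment variable.
  from openai import OpenAI

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  with open("revenue_by_region.csv") as f:  # hypothetical dataset file
      dataset = f.read()

  prompt = (
      "Summarise this dataset for a finance executive, "
      "highlight the top 3 anomalies in revenue by region, "
      "and recommend actions in bullet points.\n\n" + dataset
  )

  response = client.chat.completions.create(
      model="gpt-4o-mini",  # placeholder model name
      messages=[{"role": "user", "content": prompt}],
  )
  print(response.choices[0].message.content)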

That’s a good start, but prompt engineering doesn’t stop there.

There are also more advanced techniques in prompt engineering, such as:

  • Zero-shot reasoning (solving problems without examples)
  • Few-shot prompting (where you provide examples in the prompt)
  • Chain-of-thought prompting (which guides the AI to reason step by step)

These techniques help teams get even more precise, reliable, and context-aware responses: a clear, specific prompt guides the AI model towards targeted outputs. According to Tom’s Guide, detailed prompts ensure that the AI understands the context and provides more accurate results.
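To illustrate two of these techniques, the sketch below builds a few-shot prompt and a chain-of-thought prompt as plain Python strings. The tickets and figures are invented examples, and the exact wording is a starting point rather than a recipe.

  # Few-shot prompting: provide worked examples so the model infers the pattern.
  examples = [
      ("I was charged twice this month.", "BILLING"),
      ("The dashboard will not load in Chrome.", "TECHNICAL"),
  ]
  few_shot_prompt = "Classify each support ticket as BILLING, TECHNICAL, or OTHER.\n\n"
  for ticket, category in examples:
      few_shot_prompt += f'Ticket: "{ticket}"\nCategory: {category}\n\n'
  few_shot_prompt += 'Ticket: "Where can I download last quarter\'s invoice?"\nCategory:'

  # Chain-of-thought prompting: ask the model to reason step by step
  # before committing to an answer.
  cot_prompt = (
      "Revenue grew 4% while units sold fell 9%. "
      "Think step by step: first explain what must have happened to the "
      "average selling price, then state the conclusion in one sentence."
  )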

Like any form of engineering, prompt engineering involves designing, testing, and iterating. It requires a deep understanding of both the data and the problem space. So it’s not just about asking better questions, but also about building robust prompting systems that can scale across use cases.

That’s prompt engineering, and it’s unlocking a new way for teams to work with data and AI.

Why It Matters for Data Analysts

Data professionals, especially data analysts, already have a lot on their plate. They’re juggling dashboards, ad-hoc requests, and stakeholder reports.

According to Google Cloud, prompt engineering offers a way to make this workload more manageable, helping analysts get the best results from AI tools while automating routine tasks.

Here’s how prompt engineering helps:

  • Faster insights – By crafting clear and precise prompts, analysts can guide AI tools to generate quicker, more relevant insights, reducing the time spent on manual analysis.
  • Less manual work – With effective prompting, analysts can instruct AI tools to automate tasks like data cleaning, query generation, and summarising reports, streamlining the workflow (see the SQL-drafting sketch after this list).
  • Smarter decisions – Prompt engineering allows analysts to communicate specific needs to AI tools, helping them highlight patterns or generate hypotheses that may have been overlooked, leading to better decision-making.
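As a concrete, hedged example of the second point above, the sketch below asks an LLM to draft a SQL query from a plain-English request. It again assumes the OpenAI Python client, a placeholder model name, and an invented table schema; generated SQL should always be reviewed before it is executed.

  # Hypothetical sketch: asking an LLM to draft SQL from plain English,
  # so the analyst reviews generated SQL rather than writing it from scratch.
  from openai import OpenAI

  client = OpenAI()  # assumes OPENAI_API_KEY is set

  request = (
      "Write a SQL query against a table orders(order_id, region, "
      "order_date, amount) that returns total revenue per region for "
      "the 2024 calendar year, highest first. Return only the SQL."
  )

  draft = client.chat.completions.create(
      model="gpt-4o-mini",  # placeholder model name
      messages=[{"role": "user", "content": request}],
  )
  print(draft.choices[0].message.content)  # review before running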

Prompt engineering isn’t a replacement for SQL, Python, or R; it’s a new layer that complements them. It’s how your team can bridge the gap between raw data and real business value, especially in high-speed environments.

As AI tools like large language models (LLMs) become more integrated into data workflows, platforms are evolving to make these tools more accessible.

For instance, platforms like Databricks are now incorporating LLMs into their ecosystem. This integration enables data analysts to combine scalable data processing with natural language interfaces, making prompt engineering even more essential for those working with unified data, analytics, and AI workloads.

Real Business Use Cases for Data Analysts Across Industries

Data analysts play a crucial role in a variety of industries, helping teams make informed decisions based on data. Here are some practical examples of how data analysts are using prompt engineering in different sectors:

  • Retail: A data analyst uses AI tools to analyse customer purchase data and generate insights on shopping trends, helping marketing teams tailor campaigns to customer preferences.
  • Healthcare: A healthcare analyst leverages AI models to examine patient records, identify trends in treatment outcomes, and recommend improvements in patient care.
  • Finance: Financial analysts use prompt engineering to quickly analyse financial statements, detecting anomalies and generating insights to support investment decisions.
  • Logistics: Logistics analysts use AI to analyse delivery data, identify patterns in shipping delays, and suggest strategies to optimise delivery routes.

These examples show how data analysts are applying AI tools to accelerate their work, enabling better, faster decision-making in various fields. As Data Ideology notes, integrating AI into data analytics enhances decision-making by automating processes, predicting trends, and providing deeper insights.

What Should Data Analysts and Decision Makers Do Now?

As highlighted by Dataquest, for data analysts, adapting to the rise of prompt engineering is key to staying ahead in an increasingly AI-driven landscape. Here are some practical steps for data analysts to build their expertise in this area:

  • Master prompt engineering – Develop a deep understanding of how to craft precise and effective prompts to get the best results from AI tools like LLMs.
  • Experiment with AI tools – Explore different LLM platforms (e.g., ChatGPT, Claude, Gemini) and experiment with various ways to use them in data analysis workflows.
  • Collaborate with teams – Share your insights and effective prompting strategies with your team. Collaboration is key in building a collective knowledge base.

Now, if you’re a decision maker overseeing a data team or leading digital transformation, it’s important to support your analysts in mastering prompt engineering. Here’s how you can help:

  • Raise awareness – Start conversations about the importance of prompt engineering and how it enhances your team’s capabilities.
  • Encourage experimentation – Provide your analysts or data scientists with time and space to explore LLM tools and experiment with how these models can improve their workflows.
  • Offer training – Consider running workshops or internal demos to teach the team how to craft better prompts. Prompt engineering is a skill that can be learned.
  • Build prompt libraries – Start creating a repository of useful prompts for recurring tasks like reporting, analysis, and summarising customer feedback (a minimal sketch follows this list).
  • Hire for it – In some cases, you may want to bring in roles that blend expertise in data and prompt engineering, especially for customer-facing or insight-heavy functions.
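As a starting point, a prompt library can be as simple as a dictionary of parameterised templates. The sketch below is illustrative; the template names, fields, and wording are all assumptions to adapt to your own recurring tasks.

  # Minimal prompt-library sketch: reusable, parameterised templates
  # for recurring tasks. Names and wording here are illustrative only.
  PROMPT_LIBRARY = {
      "weekly_report": (
          "Summarise this week's {metric} for {audience}. Highlight the "
          "top 3 changes versus last week and recommend actions in "
          "bullet points.\n\n{data}"
      ),
      "feedback_summary": (
          "Group the customer feedback below into themes, with one "
          "representative quote per theme.\n\n{data}"
      ),
  }

  def build_prompt(name: str, **fields: str) -> str:
      """Fill a named template with task-specific values."""
      return PROMPT_LIBRARY[name].format(**fields)

  prompt = build_prompt(
      "weekly_report",
      metric="revenue",
      audience="a finance executive",
      data="region,revenue\nAPAC,120000\nEMEA,98000",
  )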

Research by NetSuite shows that AI can help automate tasks that ordinarily consume data analysts’ valuable time, enabling businesses to obtain insights more quickly and efficiently. By supporting your data analysts in these ways, you help ensure that your team is equipped to leverage AI tools effectively, driving smarter and faster decision-making across the organisation.

Final Thoughts: Talking to AI Is the New Literacy

We’ve entered a world where data analysts are no longer just writing queries and coding algorithms; they’re conversing with AI to unlock deeper insights.

Prompt engineering is quickly becoming a core skill for data analysts. Those who master it will work faster, deliver clearer insights, and help their companies make better decisions. The skill isn’t just about using AI; it’s about knowing how to guide these models with clear, effective prompts.

If you’re thinking about how to scale your data or analytics team with future-ready skills, Mitrais now offers AI-capable talent through our team augmentation services, ready to help you experiment, deploy, and build with Large Language Models from day one.

Because in the near future, knowing how to “speak AI” might be just as important as knowing how to code.

References:

https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai

https://cloud.google.com/discover/what-is-prompt-engineering

https://docs.databricks.com/aws/en/large-language-models/

https://www.dataideology.com/ai-in-data-analytics/

https://www.dataquest.io/blog/top-prompt-engineering-skills-you-need-to-work-with-ai/

https://www.netsuite.com/portal/resource/articles/erp/ai-in-data-analysis.shtml

Raising the Standard in Software Quality


Solving Software Testing Challenges with Certified Excellence

Software testing remains one of the most common stumbling blocks in the development process. Whether it’s bugs that surface too late, performance issues that impact user experience, or unexpected delays due to rushed QA cycles, these challenges can have a serious impact on project success. In many cases, testing is treated as a final step rather than an integrated process, leading to last-minute fixes, growing technical debt, and frustrated users. Some teams struggle with a lack of in-house QA expertise, while others find it difficult to maintain consistency as systems scale and evolve. The result is often reactive testing, which increases both risk and cost. For companies aiming to deliver high-performing, secure, and reliable software, a more structured, proactive approach is essential.

That’s why we’re proud to announce a major milestone for Mitrais: we have officially been recognised as an ISTQB Platinum Partner. This prestigious recognition from the International Software Testing Qualifications Board is awarded only to organisations that demonstrate excellence in software testing practices and invest significantly in the training and certification of their QA professionals. It reflects our longstanding commitment to delivering high-quality testing services backed by global standards. As a Platinum Partner, we offer our clients access to a team of ISTQB-certified experts, professionals who are trained to apply structured, proven methodologies in every engagement. This means your projects benefit from consistent, reliable, and internationally recognised testing practices that support your development goals and reduce time-to-market.

With over three decades of experience delivering custom software solutions, Mitrais has built a strong foundation in software testing that goes far beyond identifying bugs. Our approach is designed to help clients catch issues early, ensure performance under real-world conditions, and maintain strong security throughout the development lifecycle. We embed our QA team directly into your project workflow, making testing an integrated, efficient part of your development process, not a bottleneck. From the very beginning, we design the testing process, ensure that all requirements are addressed, and follow a detailed plan to guarantee thorough and systematic coverage. Whether you need functional testing, performance validation, security audits, or test automation, our services are tailored to fit your technology environment and business objectives. And now, with the added recognition of ISTQB Platinum Partnership, we bring even more credibility, capability, and confidence to every project we support.

At Mitrais, we understand that software quality isn’t just about checking boxes; it’s about delivering products that perform reliably and build trust from the very first interaction. If you’re looking to enhance your QA strategy or ensure your next software release meets the highest standards, we’re ready to help. Get in touch with us to learn how our certified testing team can make a difference in your development success.

Building Scalable and Reliable Data Pipelines for Modern Enterprises


In today’s data-driven world, modern enterprises depend on scalable and reliable data pipelines to manage, process, and analyse vast amounts of data efficiently. These pipelines form the backbone of data infrastructure, enabling seamless data flow from various sources to destinations such as data warehouses, analytics platforms, or machine learning models. This whitepaper explores the fundamentals of building scalable and reliable data pipelines, the evolution of cloud-based architectures, and best practices for designing pipelines that meet the complex demands of modern enterprises.

A data pipeline refers to the sequence of processes that move data from one system to another. It involves a series of steps executed in a specific order, where the output of one step becomes the input for the next.

Typically, a data pipeline consists of three key components: the source, the data processing steps, and the destination or “sink.” Data may be transformed during transit, and some pipelines focus solely on transforming data within the same system.
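In code, those three components can be as small as three composable functions. The toy sketch below uses in-memory Python data to stand in for real systems; it shows the shape of a pipeline, not a production implementation.

  # Toy pipeline: source -> processing -> sink, each step feeding the next.
  def extract():
      # Source: stands in for an API, file, or database query.
      return [{"region": "apac", "amount": "120.5"},
              {"region": "EMEA", "amount": "98.0"}]

  def transform(rows):
      # Processing: normalise labels and convert types while in transit.
      return [{"region": r["region"].upper(), "amount": float(r["amount"])}
              for r in rows]

  def load(rows):
      # Sink: stands in for a warehouse, lake, or downstream system.
      for row in rows:
          print(row)

  load(transform(extract()))  # each step's output is the next step's input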

As enterprises face increasing volumes and varieties of data, pipelines must be robust enough to handle big data demands. Ensuring no data loss, maintaining data quality, and scaling with business needs are essential. Moreover, pipelines should be flexible to process structured, semi-structured, and unstructured data.

Core Components of Data Pipeline Architecture


  1. Data Sources

These include internal and external systems generating business and customer data. Examples range from streaming platforms and analytics tools to point-of-sale systems. Every data point, from transactional records to customer behaviour, can provide valuable insights.

  2. Data Ingestion

The purpose of this stage is to efficiently collect and import data from multiple sources into the pipeline for further processing. Data is collected via APIs and ingested either in batches or as real-time streams. To avoid overwhelming the pipeline with irrelevant data, data engineers assess the variety, volume, and velocity of incoming data, ensuring only valuable data is ingested. This process can be manual or automated, especially in cloud pipelines.

  3. Data Processing

The purpose of this stage is to transform ingested raw data into a clean, standardised, and usable format for further processing. Raw data is transformed through normalisation, cleansing, validation, and aggregation. The goal is to reconcile discrepancies, filter out irrelevant data, and ensure completeness and accuracy to support reliable insights (a small pandas sketch follows this list).

  4. Data Storage

The purpose of this stage is to store processed data securely in an accessible format that supports efficient retrieval and scalability. Data is kept in repositories such as data warehouses or data lakes, chosen based on accessibility, cost, and scalability requirements. Centralised storage enables efficient retrieval for analysis and reporting.

  5. Data Analysis

The purpose of this stage is to turn stored data into insights that support better decisions. Data scientists and analysts use advanced SQL, machine learning, and statistical methods to uncover patterns, trends, and anomalies. Insights are then visualised through charts, graphs, and reports to guide decision-making.
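Returning to the processing stage (step 3), the sketch below shows what cleansing, validation, and standardisation can look like in practice. It assumes pandas and invented column names; any processing engine would follow the same pattern.

  # Minimal data-processing sketch with pandas: cleanse, validate, standardise.
  import pandas as pd

  raw = pd.DataFrame({
      "region": [" apac", "EMEA", "EMEA", None, "APAC"],
      "order_date": ["2025-04-01", "2025-04-02", "2025-04-02",
                     "2025-04-03", "2025-04-04"],
      "amount": ["120.5", "98.0", "98.0", "75.0", "n/a"],
  })

  df = raw.drop_duplicates()                           # remove exact duplicates
  df = df.dropna(subset=["region"])                    # drop incomplete records
  df["region"] = df["region"].str.strip().str.upper()  # normalise labels
  df["order_date"] = pd.to_datetime(df["order_date"])  # standardise date types
  df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # validate numbers
  df = df.dropna(subset=["amount"])                    # filter out invalid rows
  print(df)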

Types of Data Pipelines

Modern enterprises require different pipeline architectures to address various data processing needs. The following are the primary types of data pipelines used today:

  • Batch Pipelines: Process data in scheduled batches, suitable for large datasets that don’t require immediate processing, such as payroll data transfers.
  • Streaming Pipelines: Process data in real time as it is generated, ideal for use cases like financial market data or social media feeds (a minimal consumer sketch follows this list).
  • Lambda Architecture: Combines batch and streaming processing but can be complex and costly to maintain.
  • Delta Architecture: A modern data architecture that simplifies data ingestion, processing, and storage using Delta Lake technology, reducing maintenance overhead and improving efficiency.
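The sketch below illustrates the streaming case, hedged with assumptions: a Kafka topic named “orders”, the kafka-python package, and a local broker address. A batch pipeline would instead read an accumulated file or table on a schedule.

  # Hedged streaming-pipeline sketch: consume events as they arrive.
  import json
  from kafka import KafkaConsumer  # assumes the kafka-python package

  consumer = KafkaConsumer(
      "orders",                            # hypothetical topic name
      bootstrap_servers="localhost:9092",  # placeholder broker address
      value_deserializer=lambda b: json.loads(b.decode("utf-8")),
  )

  for message in consumer:  # loops indefinitely, one event at a time
      order = message.value
      print(order["region"], order["amount"])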


Benefits of Scalable and Reliable Data Pipelines

Implementing robust data pipelines delivers significant advantages to organisations. Key benefits include:

  • Improved Data Accuracy: Automated cleaning and standardisation reduce manual errors and data silos, resulting in consistent and reliable datasets.
  • Faster, Cost-Effective Integration: Standardised pipelines accelerate onboarding of new data sources, reducing labour and costs.
  • Flexibility and Scalability: Cloud-native pipelines provide elasticity to handle dynamic data growth and evolving business requirements.
  • Real-Time Data Access: Enables timely insights that improve operational efficiency and customer experiences.
  • Enhanced Data Governance: Integrated policies and audit trails ensure security and regulatory compliance.
  • Better Decision-Making: High-quality, timely data supports more accurate and efficient business decisions.

Building and Managing Data Pipelines: Best Practices

To ensure the effectiveness and reliability of data pipelines, organisations should follow these best practices:

  • Define Clear Goals: Establish objectives and key performance indicators to guide pipeline design.
  • Allocate Resources Wisely: Choose appropriate tools and ensure sufficient technical and budgetary support.
  • Identify Data Sources and Ingestion Methods: Decide between batch or streaming ingestion based on data characteristics.
  • Develop a Robust Processing Strategy: Implement data cleaning, transformation, and enrichment protocols.
  • Choose Strategic Storage Solutions: Balance accessibility, cost, and scalability when selecting storage.
  • Design Efficient Workflows: Map dependencies, error handling, and recovery processes.
  • Implement Monitoring and Governance: Continuously track pipeline health, security, and performance.
  • Ensure Reliable Data Consumption: Enable seamless access through BI tools, APIs, or reporting platforms.

The Importance of Pipeline Management 

As data volumes grow, effective management tools are essential for orchestrating, monitoring, and optimising pipeline workflows. Automation of repeatable tasks and consolidation of siloed systems ensure high-quality, up-to-date data and operational efficiency.


Traditionally, data pipelines were deployed on-premises to manage data flow between local systems and tools. However, with the rapid growth in data volume and complexity, cloud data pipelines have become the preferred architecture for modern enterprises.

Cloud data pipelines leverage platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) to automate data movement and transformation across diverse sources, storage systems, and analytics tools. For example, they facilitate moving data from e-commerce websites and business intelligence software to cloud data warehouses.

Cloud-native pipelines offer resilience, flexibility, and scalability, enabling efficient data processing, real-time analytics, and streamlined integration.

Optimising Pipelines with Serverless Architecture

Serverless architectures, managed by cloud providers, allow companies to focus on data ingestion and transformation without managing infrastructure. Serverless data pipeline solutions provide optimised computing resources, improving throughput and reducing costs compared to traditional pipelines.
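As one simplified possibility, a serverless ingestion step might be an AWS Lambda function that fires whenever a file lands in an S3 bucket. The sketch below assumes boto3, a standard S3 event payload, and a placeholder processing step; it is an illustration, not a reference implementation.

  # Hedged serverless sketch: an AWS Lambda handler triggered by an
  # S3 "object created" event; downstream processing is a placeholder.
  import boto3

  s3 = boto3.client("s3")

  def handler(event, context):
      for record in event["Records"]:
          bucket = record["s3"]["bucket"]["name"]
          key = record["s3"]["object"]["key"]
          body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
          process(body)

  def process(payload: bytes) -> None:
      # Stand-in for the transformation stage of the pipeline.
      print(f"ingested {len(payload)} bytes")

Leading Tools for Data Pipelines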

Several tools and platforms are widely used to build and manage data pipelines. The most notable include:

  • Apache Spark™: A widely adopted open-source engine for large-scale data processing, popular for building custom data pipelines. While powerful and flexible, Spark requires significant expertise to manage cluster infrastructure, optimise performance, and maintain pipeline reliability (a brief PySpark sketch follows this list).
  • Databricks: Built on top of Apache Spark, Databricks offers a fully managed, cloud-native platform that simplifies pipeline development and operations. It automates infrastructure management, provides built-in data quality testing, and supports both batch and streaming pipelines out-of-the-box. This reduces complexity and accelerates time-to-value compared to managing raw Spark clusters, making it a preferred choice for enterprises seeking scalable, reliable, and maintainable data pipelines.
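To ground the comparison, here is a minimal PySpark batch-pipeline sketch: read raw JSON, aggregate, and write to a Delta table. The paths and column names are placeholders, and the “delta” format assumes Delta Lake is available (as it is on Databricks).

  # Minimal PySpark pipeline sketch: ingest, process, and store.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("revenue-pipeline").getOrCreate()

  orders = spark.read.json("/data/raw/orders/")  # ingestion (placeholder path)

  revenue = (
      orders
      .filter(F.col("amount") > 0)                # processing: drop bad rows
      .groupBy("region")
      .agg(F.sum("amount").alias("revenue"))      # aggregate per region
  )

  # Storage: assumes Delta Lake (e.g. on Databricks); placeholder path.
  revenue.write.format("delta").mode("overwrite").save("/data/gold/revenue/")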

Conclusion

Building scalable and reliable data pipelines is essential for modern enterprises striving to harness the full power of their data. By leveraging cloud-native architectures and advanced platforms like Databricks, organisations can ensure their data pipelines are flexible, efficient, and secure, capable of handling increasing data volumes and complexity while delivering timely, accurate insights.

Mitrais, as a trusted Databricks partner, offers deep expertise in designing, implementing, and managing data pipelines tailored to your enterprise’s unique needs. Contact us today to unlock the full potential of your data infrastructure and accelerate your data-driven transformation.


The Power of Self-Improvement - Anang Prihatanto


From the academic halls of the prestigious Institut Teknologi Bandung (ITB) to the dynamic tech landscape of Mitrais, Anang Prihatanto has carved a path fuelled by passion, curiosity, and an unwavering commitment to growth.

Anang’s fascination with software development began in his university years, where he honed his skills in Informatics Engineering. His ability to craft solutions and simplify complexities set the foundation for an exciting career. After earning his degree, he took his first steps into the professional world as a programmer at a train signalling company in Bandung. While rewarding, the role lacked the vibrancy he craved. When a friend mentioned a renowned software company in Bali, Anang saw an opportunity—an exciting blend of technology and island life. With enthusiasm, he made the leap.

That leap was not just a career move—it was a personal transformation. Joining Mitrais in 2008 felt like discovering a second family—a place where innovation thrived, challenges spurred growth, and colleagues became lifelong friends. Over the years, he faced technical puzzles, client expectations, and the delicate balance between professional and personal life. But, as Anang believes, every challenge is a lesson in disguise. Whether fine-tuning his ability to decode client needs or learning the art of communication within a diverse team, each experience helped shape him into a well-rounded individual.

Beyond technical mastery, Anang’s personal growth was evident in his ability to embrace change. He pushed himself beyond his comfort zone, earning certification as a Microsoft Certified Solutions Associate (MCSA), refining his skills in Microsoft Azure, and continuously expanding his technical repertoire. In addition, the range of Mitrais internal certifications grew his skillset even further. As he says, “I found Mitrais’ Analyst Certification Program (ANCP) particularly useful in becoming a more mature developer, with its emphasis on software engineering disciplines such as requirements gathering, analysis, design, and configuration management.” More importantly, he developed resilience—learning to navigate setbacks, balance responsibilities as a husband and father, and master the art of switching off work mode when needed. His growing ability to compartmentalise challenges gave him clarity, ensuring he remained present in both his professional and personal life.

His dedication has not gone unnoticed. Recently, he stepped into a new role as Acting Technical Evangelist, a position afforded only to our most experienced engineers, which allows him to help guide Mitrais’ software engineers in advancing their skills and continuing Mitrais’ outstanding record of delivering quality to our partners and customers. The transition marked an exciting challenge—not just in technical mentorship, but in leadership, adaptability, and fostering an environment that inspires growth in others.

For Anang, Mitrais is not just a workplace—it’s an ecosystem that has nurtured his evolution, both professionally and personally. His journey—from an enthusiastic student to an inspiring mentor—reflects the power of self-improvement, perseverance, and continuous learning. And as he sets his sights on new certifications and professional milestones, one thing is certain: his transformation is far from over.