If you’re thinking about the reporting capabilities in your hospitality business, there’s a good chance you’re considering building your own bespoke reporting suite as you move away from the tedium of Excel. In a perfect world, all of your different systems would come together and layer seamlessly on top of each other so you could pull the insights you need at the click of a button.
We asked our Director of Engineering at Tenzo, someone with a lot of experience building these kinds of suites, exactly what he would do if tasked with achieving this promised land.
Typical challenges we see with bespoke reporting suites
The key is to be prepared. Have a plan of action before you start building so you can avoid the typical challenges we see hospitality businesses run into when starting this project.
“Too high-level, static, and lacking granularity”
BI projects often originate from an initial desire to have a “simple KPI dashboard” or to just “automate reporting”. These basic requirements (e.g., Sales, ATV, Labour Cost, Reviews, NPS,…) don’t require granular integrations and a comprehensive data schema.
But later, you hear grumblings that “we’re overstaffed and losing money… staff are finding the shifts stressful… yesterday’s labour was low but customers were frustrated by the wait”.
Unfortunately, you can’t delve into these issues because you need granular data from different systems overlaid on one table/chart. Things like:
- hourly sales data
- hourly labour data
- employee feedback/NPS by shift
- customer feedback
So you go back to living in Excel and downloading CSVs!
“No one uses it”
Some teams plan to have one or two finance members trained on these tools to handle all reporting and ad-hoc requests. But General Managers, Area Managers, and Marketing often just stick to Excel, either because they don’t know what to ask for until they’ve played around with the data, or because they want the answer right now, not next week.
Another problem is if the trained users leave the business. When that happens and no one knows how to use the BI suite or what the underlying tables mean, you end up with obsolete tech and back in Excel again.
“We don’t trust the data”
Integrations are tough, and they will break as APIs change or suppliers have downtime. When these issues aren’t caught and corrected quickly, people will simply stop trusting the reports… and go back to Excel!
“Too late and too slow”
Often data warehouses and reports aren’t designed for quick and real-time data loading because it’s complex and adds costs that aren’t necessary for a Finance KPI Dashboard project. This means that most data only refreshes daily and can’t be used for intra-day decision-making.
Key considerations when approaching the project
Considering the above challenges, the key themes to consider when approaching this project are data access, data storage, and data visualisation.
Data Access
In any reporting tool, the first step is usually data access. For hospitality businesses, data comes from several sources — Point of Sale (POS) systems, labour management tools, inventory management, customer feedback, and more. Each source presents unique challenges, such as proprietary APIs, data availability, and security protocols that can slow down integration.
When building your own system, it’s easy to underestimate the ongoing maintenance involved in keeping APIs, custom connectors, and database syncs up and running. API-based integrations are the easiest, but APIs still change: they are updated or deprecated and require consistent revisions.
In some cases, systems don’t have modern APIs or have limited APIs that don’t fit your needs. For systems that are not API-based—for example, a legacy POS system that only offers manual CSV exports or requires direct database exports—building a custom solution introduces even more complexity:
You may need to develop custom ETL pipelines (Extract, Transform, Load) to manage these formats and make regular data imports, which is both time-consuming and fragile. Non-standard integrations with systems that rely on manual data dumps, web scraping, or database queries will require dedicated developer hours just to create, test, and maintain these connectors. Real-time data becomes even more difficult to achieve. With non-API systems, it may be impossible to get live data refreshes—you’d instead depend on periodic data dumps, adding to data latency issues.
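To make the ETL idea concrete, here is a minimal Python sketch of such a pipeline for a CSV-only POS export. Everything here is an assumption for illustration: the export format, the column names, and the SQLite staging table are all invented, not any particular POS vendor’s format.

```python
import csv, sqlite3, io
from datetime import datetime

# Hypothetical sample of a legacy POS CSV export (format and columns are assumed).
RAW_CSV = """trans_id,date,item,gross
1001,31/10/2024,Flat White,3.50
1002,31/10/2024,Croissant,2.80
"""

def extract(text):
    """Extract: parse rows out of the raw CSV dump."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: standardise dates to ISO 8601 and cast amounts to numbers."""
    out = []
    for r in rows:
        out.append({
            "trans_id": int(r["trans_id"]),
            "date": datetime.strptime(r["date"], "%d/%m/%Y").date().isoformat(),
            "item": r["item"].strip(),
            "gross": float(r["gross"]),
        })
    return out

def load(rows, conn):
    """Load: upsert into a staging table the warehouse can read from."""
    conn.execute("CREATE TABLE IF NOT EXISTS pos_sales "
                 "(trans_id INTEGER PRIMARY KEY, date TEXT, item TEXT, gross REAL)")
    conn.executemany(
        "INSERT OR REPLACE INTO pos_sales VALUES (:trans_id, :date, :item, :gross)",
        rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT date, SUM(gross) FROM pos_sales GROUP BY date").fetchall())
```

Even this toy version hints at the fragility: if the supplier changes the date format or adds a column, the transform step breaks and someone has to notice and fix it.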
For every integration or custom connector, there’s a maintenance cost associated with changes to the external platform. As APIs evolve, custom solutions may require constant updating to ensure continued access to data. Without predefined support, fixing these issues becomes time-consuming and often interrupts regular business operations.
However, developing your own connectors can give you more control, allowing you to build custom integration logic that perfectly fits your business (e.g., merging POS and invoice data in a non-standard way).
The key question to ask yourself is whether you have access to the expertise to build and maintain these connectors, whether that’s in-house developers or external consultants. If you choose to go the in-house route, integrating services like POSs, reviews, inventory, labour, etc. into a unified reporting structure requires strong API experience and long-term developer support. You also need redundancy plans so knowledge can be transferred if you lose team members: untangling past development work without any prior knowledge is a mammoth task.
Data storage
Once data is ingested, the next critical component is how it is stored. How you decide to store data has an outsized impact on speed (how fast reports can be generated), scalability (how much your data grows), and long-term costs.
Setting up a dedicated data warehouse (e.g., AWS Redshift, Google BigQuery) gives you more direct control over how data is stored. You’ll create your schemas (how you organise the data) and optimise queries for reporting, giving you more precise control over query time. However, this setup takes weeks to months to implement.
A note on your schemas: you’ll want to think through how you set these up as ultimately they’ll dictate what kind of metrics you report on. For example, if you want to run reports on menu profitability you’ll need to link your sales data to your inventory data in a way that allows for it. Check out Tenzo’s universal schema to see an example of how we do it.
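As a simplified, hypothetical illustration of why the schema matters, the SQLite sketch below links a sales fact table to a menu-item dimension that carries recipe costs from inventory. That shared key is exactly what makes a menu-profitability query possible. All table names, columns, and figures are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: menu items, with recipe cost sourced from inventory data.
    CREATE TABLE menu_items (item_id INTEGER PRIMARY KEY, name TEXT, recipe_cost REAL);
    -- Fact: individual sale lines from the POS, keyed to the item dimension.
    CREATE TABLE sales (sale_id INTEGER PRIMARY KEY,
                        item_id INTEGER REFERENCES menu_items,
                        sold_at TEXT, price REAL);
    INSERT INTO menu_items VALUES (1, 'Burger', 2.10), (2, 'Salad', 1.40);
    INSERT INTO sales VALUES (1, 1, '2024-10-31', 9.50),
                             (2, 1, '2024-10-31', 9.50),
                             (3, 2, '2024-10-31', 7.00);
""")

# Menu profitability is only answerable because sales and inventory
# data share item_id; without that link the question can't be asked.
rows = conn.execute("""
    SELECT m.name, SUM(s.price - m.recipe_cost) AS gross_profit
    FROM sales s JOIN menu_items m USING (item_id)
    GROUP BY m.name ORDER BY gross_profit DESC
""").fetchall()
print(rows)
```

If sales and inventory had been loaded into unrelated tables with no common key, this report would require the manual matching-in-Excel work the project was meant to eliminate.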
You must also consider the full infrastructure cost of hosting, scaling, and securing your custom data warehouse, along with access controls and setting up redundancies. Over time, as the amount of data scales, the cost of managing and optimising the database will increase.
But, if your reporting needs require complex joins or specific indexing strategies, building your own storage solution can give you an advantage. You can optimise the database for fast queries that are custom-tailored to your exact reporting needs.
It is worth noting though that managing a data warehouse requires strong SQL/NoSQL database expertise, schema design, and experience scaling databases to match the changing load.
Data visualisation
With data securely stored, the next technical challenge is to surface that data in a manner that suits the business. In the hospitality industry, important KPIs like revenue numbers, labour costs, and food waste need to be made accessible via formats like dashboards, reports, and real-time alerts.
Developing custom BI dashboards using systems like Power BI, Looker or Tableau gives you the ability to fully customise the way data is presented—creating a visual representation that fits exactly what you need. However, crafting every report and visualisation takes more development time. These suites are also not intuitive to use, so require prior knowledge or training to use effectively.
Developing custom reports carries initial development costs as well as ongoing engineering involvement. Any time a new report is needed, an existing report requires customisation, or a bug is found, developers will have to step in.
However, complete control over how data is displayed means you can design unique visualisations that give you a competitive advantage. For instance, building a custom dashboard to track staff productivity against top-performing menu items might offer deep insights.
Once you’ve considered these three elements, you’re ready to start sketching out your build.
Build your own reporting suite from scratch
1. Define data strategy and identify use cases
When approaching a project like this, one error we often see is starting without a clear idea of what it is the business wants to get out of it. So, before even beginning to think about implementation there should be a clear definition of what the project is and who it’s for.
The first thing to do is prioritise the key data use cases. Are you looking to, for example, increase sales, reduce food waste, optimise labour, or predict customer demand? These should align with overall business goals.
Once you’ve identified your key use cases, you should separate them into categories, e.g. operations use cases (optimise staff schedules, reduce food waste) and finance use cases (managing profitability).
2. Identify user personas
Who is going to be using these reports you eventually build? Each persona will likely need a different level of accessibility and complexity. For example, are you expecting your restaurant managers to use this for day-to-day decision-making? If so, the reports need to be easily accessible on mobile, they need to be clear and not overly complicated, and they need to be embedded in operational processes.
However, if you want these reports to also be used by head office or by the finance department, there needs to be enough granularity and the capacity to run complex reports.
Having a clear idea of who is using this reporting suite will guide how you build it out, and if you haven’t thought this through beforehand, getting the process embedded later will be extremely painful.
3. Identify data sources
Your use cases should guide you through deciding which data sources you need. It’s important to remember that the more data sources, the more complex the project, but the more extensive the reporting capabilities.
At Tenzo we focus on the operational sources including POS, labour scheduler, inventory management, reservations, and social reviews. But, if your use case is more marketing-focused, you could consider including analytics, CRM, or email.
Think about whether you need external data as well, such as weather data to feed forecasts.
4. Design capabilities and functionalities
The design phase is where you define the technical capabilities and features that your BI reporting suite must offer. This is where you consider all technical features, from aggregating data to implementing security, forecasting, and real-time insights.
Data aggregation
Once you’ve decided what sources you want to connect, you’ll need to assess how you’re going to access the data, whether that’s through API connections or database connectors.
Here are some examples of the actions you can take as well as the tools necessary to enable this.
- POS (Point of Sale) Data:
- Action: Use API connections or database connectors to extract sales and customer data from your POS system.
- Tools: Power BI Dataflows, Apache Airflow, or Azure Data Factory to link to POS systems’ APIs.
- Labour Management Systems:
- Action: Use API connections or database connectors to fetch staff schedules, work hours, labour costs, and absences.
- Tools: Power BI’s Web Connector, or scheduled jobs set up in Power Automate to retrieve the data.
- Inventory Systems:
- Action: Access data on stock levels, wastage rates, and purchase orders by connecting to your inventory management systems via APIs, flat files (CSV or Excel), or databases.
- Customer Reviews and External Data:
- Action: Pull data from external APIs (Google Reviews, Yelp), or connect to weather forecasting or local event data via public APIs.
- Tools: Power Query to normalise data from multiple formats and sources into a uniform structure within Power BI.
Data collection tools like Azure Data Factory, Power Query, or custom-built ETL pipelines can aggregate the data. Regular refresh schedules (e.g., Power BI scheduled refresh) should be set based on the frequency of data updates (e.g., real-time vs hourly vs daily).
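To make the “uniform structure” idea concrete, here is a small Python sketch that normalises two differently shaped feeds onto a common hourly key so sales and labour can be overlaid on one chart. The payloads and field names are invented; real suppliers each name and shape their data differently, which is precisely the problem.

```python
from collections import defaultdict

# Hypothetical payloads; field names vary by supplier (all assumed here).
pos_payload = [
    {"ts": "2024-10-31T12:00", "net_sales": 480.0},
    {"ts": "2024-10-31T13:00", "net_sales": 610.0},
]
labour_payload = [
    {"hour": "2024-10-31T12:00", "wage_cost": 150.0},
    {"hour": "2024-10-31T13:00", "wage_cost": 150.0},
]

def merge_hourly(pos_rows, labour_rows):
    """Normalise both feeds onto one hourly key so sales and labour overlay."""
    buckets = defaultdict(lambda: {"sales": 0.0, "labour": 0.0})
    for r in pos_rows:
        buckets[r["ts"]]["sales"] += r["net_sales"]
    for r in labour_rows:
        buckets[r["hour"]]["labour"] += r["wage_cost"]
    # Derive labour as a % of sales -- the overlay metric the reports need.
    return {
        h: {**v, "labour_pct": round(100 * v["labour"] / v["sales"], 1)}
        for h, v in sorted(buckets.items())
    }

hourly = merge_hourly(pos_payload, labour_payload)
print(hourly)
```

This is the granularity the “too-high-level” dashboards above lack: once both feeds share an hourly key, the overstaffed-versus-understaffed question becomes a one-line query rather than a CSV-wrangling exercise.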
Be alerted to critical insights
Alerts are very helpful for pointing out issues before they become critical problems. Users of the reporting suite don’t have to trawl through data to try and identify incidents.
Once you have your key reports and KPIs set up, you may want to create alerts on them, e.g. if daily sales drop below a certain threshold, inventory runs too low, or labour cost exceeds budget.
To facilitate this you could use Power BI’s alerting feature (available for numeric tiles, gauges, and card visuals) or Power Automate (formerly Microsoft Flow) to trigger automated processes or notifications when alerts are triggered (e.g., send a Slack message or email).
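As an illustration of the logic behind such alerts, independent of any particular BI tool, here is a small Python sketch. The KPI values, thresholds, and rule format are all invented; in practice the values would come from your warehouse and the messages would be pushed out via Power Automate, a webhook, or email.

```python
# Assumed KPI snapshot and thresholds; in practice these come from the warehouse.
kpis = {"daily_sales": 4200.0, "stock_units": 35, "labour_cost_pct": 34.5}
rules = [
    ("daily_sales", "below", 5000.0, "Daily sales below target"),
    ("stock_units", "below", 50, "Inventory running low"),
    ("labour_cost_pct", "above", 32.0, "Labour cost over budget"),
]

def evaluate_alerts(kpis, rules):
    """Return one message per breached threshold; a real system would
    push these to Slack or email rather than just returning them."""
    alerts = []
    for metric, direction, threshold, message in rules:
        value = kpis[metric]
        breached = value < threshold if direction == "below" else value > threshold
        if breached:
            alerts.append(f"{message}: {metric}={value} (threshold {threshold})")
    return alerts

alerts = evaluate_alerts(kpis, rules)
for a in alerts:
    print(a)
```

The value of keeping rules as data (rather than hard-coding each check) is that operations staff can add or tune thresholds without a developer redeploying anything.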
Should demand forecasting be included?
Is forecasting part of the scope of your reporting project? If you’re looking to optimise staffing or order inventory more effectively, it probably should be. This will require some machine learning expertise likely in the shape of a data scientist who can create machine learning models that forecast sales demand based on historical data trends.
To create this functionality, you could use Azure Machine Learning or Power BI Python/R scripts, integrated via Power BI Service or Azure AI services such as Cognitive Services.
If you want to layer on more data points like weather or events calendars, these will need to be passed through the machine learning model alongside historical sales data for predictions to be output and then imported back into your visualisation platform.
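To make the forecasting idea concrete without any ML infrastructure, here is a deliberately naive Python sketch that predicts sales from weekday seasonality alone. The sales figures are illustrative; a production model built by a data scientist would add trend, weather, and event features on top of this baseline.

```python
from datetime import date, timedelta
from statistics import mean
from collections import defaultdict

# Toy history: four weeks of daily sales with a weekend uplift (illustrative).
start = date(2024, 9, 2)  # a Monday
history = []
for i in range(28):
    d = start + timedelta(days=i)
    history.append((d, 1300 if d.weekday() >= 5 else 1000))

def forecast_by_weekday(history, target_day):
    """Naive seasonal forecast: predict the mean of past sales on the
    same weekday. This is the baseline any ML model must beat."""
    by_weekday = defaultdict(list)
    for d, sales in history:
        by_weekday[d.weekday()].append(sales)
    return mean(by_weekday[target_day.weekday()])

prediction = forecast_by_weekday(history, date(2024, 10, 5))  # a Saturday
print(prediction)
```

A baseline like this is worth building first even if you plan to invest in proper ML: it tells you how much accuracy the extra data points (weather, events) actually buy you.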
Key functionality to consider
- Speed: Ensure real-time data availability where necessary. Many reports lose relevance if the data is over 12 hours old.
- Security: Implement role-based security and Row-Level Security (RLS) for sensitive data. Remember that many APIs contain sensitive information like credit card numbers or National Insurance Numbers.
- Ease of Use: Design simple and user-friendly dashboards. There needs to be differing levels of complexity depending on the intended user.
- Device Compatibility: Ensure all reports are mobile-friendly for managers on the go.
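As a language-agnostic illustration of the row-level security point above, the Python sketch below filters fact rows down to the sites a user is entitled to see, mirroring what a Power BI RLS role would enforce at the dataset level. The users, sites, and figures are invented.

```python
# Hypothetical role-to-site mapping implementing row-level security in code.
USER_SITES = {
    "gm_soho": {"Soho"},            # a General Manager sees only their site
    "finance": {"Soho", "Camden"},  # finance sees every site
}

sales_rows = [
    {"site": "Soho", "net_sales": 4200.0},
    {"site": "Camden", "net_sales": 3100.0},
]

def rows_for_user(user, rows):
    """Filter fact rows to the sites a user may see; unknown users see nothing.
    In Power BI the equivalent is an RLS role with a site filter."""
    allowed = USER_SITES.get(user, set())
    return [r for r in rows if r["site"] in allowed]

print(len(rows_for_user("gm_soho", sales_rows)))
print(len(rows_for_user("finance", sales_rows)))
```

Defaulting unknown users to an empty set (rather than to everything) is the important design choice: access control should fail closed.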
5. Select Tools and Build Infrastructure
While you design your capabilities, you’ll need to choose what tools you’re going to use. This will include selecting the right cloud services and tools for your visualisation platform implementation. I’ll also cover how to technically configure each piece of the infrastructure to ensure scalability, security, and performance.
Choose the Core Tools and Technologies
- Cloud Infrastructure:
- Action: Select a cloud service provider and environment that fits your needs. This will host your databases, ETL pipelines, and Power BI datasets to be accessible from anywhere.
- Components of Cloud Infrastructure:
- Data Storage: Use SQL databases like Azure SQL Database or Amazon RDS (PostgreSQL or MySQL).
- Data Lake (optional): For large, unstructured data such as raw logs and streaming transactions, use Azure Data Lake or Amazon S3.
- Data Warehouse:
- How:
- Set up a central data warehouse to store integrated sales, labour, and inventory data.
- Make sure the warehouse supports scalable querying for potentially large datasets.
- Steps:
- Use Azure Synapse Analytics or Amazon Redshift for a managed data warehouse solution.
- Design and set up ETL processes to regularly extract, transform, and load data into your warehouse from multiple data sources using tools like Apache Airflow or Azure Data Factory.
- Technical Configuration:
- Partition data by dates (e.g., monthly partitions) to improve query speed.
- Use star schema or snowflake schema design to ensure efficient relation between dimension tables (e.g., staff names, items) and fact tables (POS transactions, shifts).
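A minimal star schema along those lines can be sketched in SQL. The SQLite example below is illustrative only: managed warehouses like Redshift or Synapse offer genuine date partitioning, which SQLite can only approximate with an index on the date key; all table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: descriptive attributes, one row per entity.
    CREATE TABLE dim_date  (date_key TEXT PRIMARY KEY, month TEXT, weekday INTEGER);
    CREATE TABLE dim_item  (item_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_staff (staff_key INTEGER PRIMARY KEY, name TEXT, role TEXT);

    -- Fact table: one row per POS transaction line, foreign keys to dimensions.
    CREATE TABLE fact_sales (
        date_key  TEXT    REFERENCES dim_date,
        item_key  INTEGER REFERENCES dim_item,
        staff_key INTEGER REFERENCES dim_staff,
        qty INTEGER, net_amount REAL
    );
    -- Warehouses partition fact tables by month for query speed;
    -- in SQLite an index on the date key is the closest analogue.
    CREATE INDEX idx_fact_sales_date ON fact_sales (date_key);
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

The star layout keeps the fact table narrow and numeric (fast to scan) while descriptive attributes live in the small dimension tables, which is what keeps dashboard queries responsive as transaction volume grows.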
- Data Integration & ETL (Extract, Transform, Load):
- Action:
- Use ETL tools to connect and consolidate data from different restaurant systems (POS, inventory, labour).
- Tools:
- Apache Airflow: This is a scalable tool for scheduling and orchestrating ETL jobs. For example, you can schedule daily data consolidations, aggregations, and normalisations.
- Azure Data Factory: Simplifies data pipeline orchestration. Capable of connecting POS systems, labour management APIs, and inventory databases, and delivering the processed data directly to the warehouse.
- Process:
- Extraction: Fetch raw data from the POS, labour, inventory, and other systems.
- Transformation: Clean and standardise this data (e.g., formatting dates, standardising item descriptions across systems).
- Loading: Load the cleaned data into your central database or data warehouse for Power BI to query.
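As a small illustration of the Transformation step, the Python sketch below standardises item descriptions that arrive differently from different POS systems, so they aggregate correctly downstream. The canonical mapping and item names are invented; real pipelines often keep such a mapping in a lookup table that operations staff can maintain.

```python
# Assumed canonical mapping; in practice this lives in a maintained lookup table.
CANONICAL_ITEMS = {
    "flt wht": "Flat White",
    "flat white": "Flat White",
    "crsnt": "Croissant",
}

def standardise(rows):
    """Transform step: map messy, per-system item descriptions onto one
    canonical name; unmapped items fall back to a tidied title-case form."""
    out = []
    for r in rows:
        key = r["item"].strip().lower()
        out.append({**r, "item": CANONICAL_ITEMS.get(key, r["item"].strip().title())})
    return out

raw = [{"item": "FLT WHT", "qty": 2},
       {"item": "flat white ", "qty": 1},
       {"item": "espresso", "qty": 3}]
cleaned = standardise(raw)
print(cleaned)
```

Without this step, “FLT WHT” and “Flat White” would show up as two different products in every report, silently understating sales of each.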
- Choose the Visualisation Tool (e.g. Power BI):
- How:
- As Power BI is the visualisation layer, ensure its datasets are updated regularly with cleaned and consolidated data from the data warehouse.
- Process:
- Design Power BI Dataflows for reusable ETL processes within Power BI itself, or connect to a backend Data Warehouse using SQL-based queries.
- Use Power Query for additional transformations inside Power BI and organise your queries for optimal performance (e.g., applying filters to reduce unnecessary data load).
- Steps:
- Import datasets from the central data warehouse and configure relationships for all relevant fields (sales, labour, stock levels).
- Create calculated measures using DAX (Data Analysis Expressions) to compute complex metrics (e.g., Average Daily Revenue, Gross Profit Margins by Menu Item).
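To show the logic behind such a measure outside of DAX, the Python sketch below computes Average Daily Revenue from toy fact rows: sum revenue per day, then average the daily totals. One common DAX pattern for the same calculation is shown in a comment for orientation; the data and figures are invented.

```python
from collections import defaultdict
from statistics import mean

# Toy fact rows. A common DAX pattern for the same measure is roughly:
#   Avg Daily Revenue = AVERAGEX(VALUES(Sales[date]), CALCULATE(SUM(Sales[net])))
sales = [
    {"date": "2024-10-30", "net": 1200.0},
    {"date": "2024-10-30", "net": 800.0},
    {"date": "2024-10-31", "net": 1500.0},
]

def average_daily_revenue(rows):
    """Sum revenue per day, then average the daily totals -- averaging
    the raw rows instead would give the wrong answer."""
    per_day = defaultdict(float)
    for r in rows:
        per_day[r["date"]] += r["net"]
    return mean(per_day.values())

print(average_daily_revenue(sales))
```

The subtlety the code makes explicit is the two-step aggregation: a naive average over transaction lines would weight busy days by their row count rather than treating each day equally.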
- Data Refresh Configuration:
- DirectQuery: Real-time reporting without data import, but requires high availability and real-time connections. Great for managers needing up-to-date operational reports.
- Import Data Mode: For non-real-time data or larger data sources, configure a refresh schedule (hourly or daily depending on business needs).
6. Implementation and maintenance
Establish Data Security and Reliability
Make sure that your system is robust and reliable. If there are errors or data sources fail, ensure that you have fallbacks in place that retry extraction until the data is processed successfully and alert you to any problems.
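The retry-and-alert fallback described above can be sketched as follows. The flaky fetch function simulates a supplier outage; in production the alert callback would post to Slack or send an email rather than print, and the delays would be seconds or minutes, not milliseconds.

```python
import time

def fetch_with_retry(fetch, attempts=3, base_delay=0.01, alert=print):
    """Retry a flaky extraction with exponential backoff; raise an alert
    if every attempt fails so a broken feed never fails silently."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except Exception as exc:
            if attempt == attempts:
                alert(f"Extraction failed after {attempts} attempts: {exc}")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # back off before retrying

# Simulated flaky source: fails twice, then succeeds.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("supplier API timeout")
    return {"rows": 42}

result = fetch_with_retry(flaky_fetch)
print(result)
```

The alerting hook matters as much as the retries: the “we don’t trust the data” failure mode earlier in this piece usually starts with an integration that broke quietly and stayed broken for days.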
As mentioned above, sensitive information is often exposed via APIs, so make sure that you have levels of authorisation in place and restrict what data can be manipulated and accessed across the business. Also, ensure that data is encrypted in transit and at rest.
Future-proofing
If you’re looking to build something as complex as your own reporting suite, then I’m sure you are looking to grow the business. If this is the case, make sure that this work is scalable as you increase your site count and that you can handle ever larger data sets.
Conclusion
If you’re thinking about building your own reporting suite from scratch, hopefully these tips give you a framework to work to. The truth is, these projects are not simple and, unless you have the in-house expertise (a team of dedicated developers and data engineers), they are extremely difficult to implement.
As an alternative, think about Tenzo. We’ve done everything above at scale, so you get robust reporting and analysis capabilities without the nightmare of building it from scratch. See how we’re better than a data visualisation platform: a SaaS product built specifically for hospitality, accessible to anyone in the business.
If you’re thinking about embarking on this project, feel free to get in touch. We’re always happy to chat about reporting structures and give advice where we can!