
What Qualities are Most Important in a Business Intelligence Project Manager?

What do Data Analysts think about working with Project Managers for BI projects? What makes a good manager and what are some signs of a poor Project Manager?

These are our top three characteristics. Notice that they have more to do with philosophy than with hard skills, like who can build the longest Gantt chart.

  1. Vision
  2. Project understanding
  3. Realistic expectations
  • Having a sense of solidarity with the team: win together, perform together, and never be somebody who would throw the team under the bus.
  • Having an understanding of the application is very important. If possible, the project manager should be trained as any other user of the application, and be able to experience it as a user would. We’ve worked with teams where the PM had no understanding of the application, and it made them unable to set realistic expectations. Over time, this became a major liability and cost them the trust of the team.
  • Agile is a well-known standard in any IT environment. It’s becoming socially unacceptable for BI project managers to not know Agile. A good PM focuses on the work and how to get it done, doesn’t pressure team members to do the impossible, takes the time to listen and understand what you’re telling them, helps to remove obstacles, stands up for the truth, and doesn’t make tasks “57% complete”.

In one case I worked with a BI PM in the finance vertical who was extremely helpful. He was very savvy, understood the politics of large corporations, and really led the project to a successful outcome (and around several challenges). I’ve also worked with PMs from software development backgrounds who pushed antiquated waterfall approaches onto BI work, which had a really negative impact on the team. The PM in this case was respected and influential with leadership, but was not democratic with the team and led everyone in the wrong direction. Given his influence, we went down the wrong path even faster and further before we could recover (which included switching out the PM).

  • It’s not important for the PM to have extensive technical knowledge.  They’re responsible for delivering good work on the proper timeline, not the tech.
  • Teams of analysts expect the requirements to be laid out, with any reference info you can find attached to them.
  • Requirements are everything in a BI project. Having them complete and at the proper level of detail is essential to success. Not hearing about a “must have” feature until late in development is what ruins schedules and frustrates work teams. That said, don’t worry about writing super-specific requirements docs.
  • Devs don’t want to read them. Especially for dashboards, requirements should be as specific as they need to be to communicate concepts. We don’t need three paragraphs describing what a filter control does. A PowerPoint with sketches and notes is often more effective than a 50+ page requirement doc.
  • Be creative with requirements. Take out your phone and snap photos of people’s whiteboards with critical requirements on them. Some people are more verbal, some are more visual. If a client just drew you something for an hour, you’d better take a photo of it and share it with us, don’t try to document it in words. You’ll get it wrong.
  • Finally, resourcefulness and attention to detail are also appreciated. If the PM does not know where something is, they can at least include a link to a contact who might know. 
  • Good PMs don’t make the analysts stand around waiting for more work. Have a handful of tickets ready to go, because while you’re busy in other all-day meetings the team will be idle, waiting for more assignments.
  • A good BI PM always assigns out a reasonable amount of work: just enough to keep people busy, plus a little bit more. This makes maximum use of people’s time, resulting in more work done in less time and far fewer nights and weekends worked at the end.
  • Analysts and Devs, myself included, can be dramatic. Expect it. Don’t get offended. But if you treat us well and make our lives easier by setting us up well we will move mountains for you.


How to Develop Your BI Roadmap for Success

Business intelligence is more than just a buzzword. Today’s BI apps and offerings give companies the edge they need to stay competitive in a market where customers increasingly tune out information that’s not tailored to their likes, needs, and desires. But implementing a business intelligence strategy is a resource-intensive process, and getting it right takes proper planning.

What Is Business Intelligence?

Business intelligence is the process of leveraging data to improve back-office efficiency, spot competitive advantages, and implement profitable behavioral marketing initiatives. Organizations that fail to institute proper business intelligence strategies:

  • Miss out on strategic growth opportunities
  • Routinely fall short on customer satisfaction
  • Overspend on promotional projects with little ROI
  • Remain reactive instead of proactive

Why Do We Need a Business Intelligence Strategy?

Business intelligence initiatives are not plug-and-play propositions, and instituting a BI process without a proper plan is like setting sail with broken navigational equipment. Sure, you can use the sun as a general guide, but the chance of landing at your exact destination is between slim and none. The same goes for businesses without clear and defined BI strategies. Departments will inevitably veer off course and toil toward different ends, and the data quality almost always suffers.

Seven Steps to a Successful BI Strategy

Now that we’ve established why business intelligence strategies are so important, how do you get started when implementing one? Consider these seven critical steps when defining your roadmap.

Step #1: Survey the Landscape

One of the biggest mistakes companies make when embarking down a BI path is failing to survey the landscape. It may sound cliche, but understanding where you are in relation to your desired destination is of paramount importance. During this stage, answer questions like:

  • What obstacles are we likely to face during this process?
  • Where are our competitors, and what are they doing that we’re not?
  • What resources are available that fit our budget?
  • How can we leverage data to increase sales and improve efficiency?

Step #2: Identify Objectives

Once you’ve got a handle on your niche’s market topography, it’s time to set goals. Too often, organizations and companies don’t get specific enough in this phase. While “making more money” or “securing more members” are, technically, goals, they’re too broad. During this stage, drill down your objectives. By how much do you want to grow your customer base? What is a reasonable expectation given market conditions? What metrics will you use to measure progress?

Step #3: Build the Right Team

The goals are in place. Next is team-building. The ideal BI working group is multi-disciplinary. Not only do you need a strong IT arm to handle and transform unstructured data, but it’s also important to include representatives from all the departments that will be using the information.

Step #4: Define the Vision

Defining a BI vision is similar to identifying objectives but not quite the same. In this step, members of the working group share their departmental processes and map out the ideal data flow. Defining objectives deals with end goals; vision mapping is about implementation practicalities. Which departments will receive it and when? How will they get it? Is there a roll-out hierarchy? How will the data be used?

Step #5: Build the Digital Infrastructure

Once the roadmap has been drawn, it’s time to start crafting the data pipeline. This step is mainly the responsibility of either an in-house IT department or a third-party data analytics platform. The ultimate objective of this step is to produce clean data that are distributed to the right people in a useful format.

Step #6: Launch Your Solution

It’s time to launch your system! Yes, by this point, you’ve likely held dozens of meetings — if not more — and tested your data pipeline and reporting systems like there’s no tomorrow. Yet, there’s still a 99.9 percent chance that you’ll need to make adjustments after launch. Expect it and plan for it. 

Step #7: Implement a Yearly Review Process

Pat the team members on their backs. Developing and implementing a business intelligence strategy is no small feat. But also understand that things will change. Your market may shift; your target demographic’s wants and needs will evolve — as will the technology. As such, it’s essential to review your strategy, data pipeline architecture, and goals yearly.

While this roadmap is by no means entirely exhaustive, business intelligence is a must-have in today’s marketplace. Having the technology isn’t enough. Meticulously mapping out a comprehensive strategy is what makes your BI initiative profitable and successful in the long run.


Self-Service Analytics: Turning Everyday Insight Into Actionable Intelligence

Business intelligence and analytics have become essential parts of the decision-making process in many organizations. One of the challenges of maximizing these resources, though, comes with making sure everyone has access to the analysis and insights they need right when they need them. The solution you may want to consider is self-service BI.

What is Self-Service BI?

The idea behind self-service BI is simple. Users should be able to access reports and analysis without depending on:

  • An approval process
  • A third party
  • Any specific person in the organization

In other words, everyone should be able to ask the person to their left to pull something up. If the boss needs the details of a report, the team should be able to access key information without contacting a help desk or a third-party vendor. When users need help, anyone from the top down should be able to address the issue instantly by pointing them to the proper dashboards and tools.

Defining Your Requirements

Before getting too deep into the complexities of self-service BI, it’s important to establish what your requirements are. First, you’ll need to have the resources required to provide self-service to your end-users. If you’re going to have 10,000 people simultaneously accessing dashboards from locations across the globe, that’s a huge difference compared to a company that has 5 people in the same office on a single system.

Scalability is an extension of that issue. If your company has long-term growth plans, you don’t want to have to rebuild your entire analytics infrastructure three years from now. It’s important to build your self-service BI system with the necessary resources to match long-term developments.

Secondly, you’ll want to look at costs. Many providers of BI systems employ license structures, and it’s common for licenses to be sold in bulk. For example, you might be able to get a discount by purchasing a 500-user license. The licensing structure and costs must match your company’s financial situation.

Finally, you need to have a self-service BI setup that’s compatible with your devices. If your team works heavily in an iOS environment on their phones, for example, you may end up using a different ecosystem than folks who are primarily desktop Windows users.

Developing Skills

Put a handbook in place that outlines the basic skills every end-user must have. From a data standpoint, users should understand things like:

  • Data warehousing
  • Data lakes
  • Databases

They also should have an understanding of the BI tools your operation utilizes. If you’re using a specific system in one department, you need to have team members who can get new users up to speed company-wide. You’ll also likely need to have team members who are comfortable with Microsoft Excel or Google Sheets in order to deal with the basics of cleaning and analyzing data.

Your users need to be numerate enough to understand broad analytics concepts, too. They should understand the implications of basic stats, such as why small sample sizes may hobble their ability to apply insights to larger datasets.
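The sample-size point can be made concrete with a back-of-the-envelope calculation: the margin of error on a sample mean shrinks with the square root of the sample size. A minimal Python sketch, using illustrative numbers only:

```python
import math

def margin_of_error(stdev: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample mean."""
    return z * stdev / math.sqrt(n)

# Same spread of responses, very different confidence in the average.
small = margin_of_error(stdev=10, n=25)    # +/- 3.92
large = margin_of_error(stdev=10, n=2500)  # +/- 0.39
print(f"n=25:   +/-{small:.2f}")
print(f"n=2500: +/-{large:.2f}")
```

A user who grasps this relationship knows to hesitate before projecting an insight from 25 survey responses onto the whole customer base.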

Understand How Users Will Behave

Having the best tools and people in the world will mean nothing if your team members are always struggling to work the way they need to. This means understanding how they’ll use the system.

Frequently, user behaviors will break up into distinct clusters, each with its own quirks. Someone putting together ad hoc queries, for example, is going to encounter a different set of problems than another user who has macros set up to generate standard reports every week. Some users will be highly investigative while others are largely pulling predefined information from the system to answer questions as they arise.

Within that context, it’s also important to focus on critical metrics. Team members shouldn’t be wandering through a sea of data without a sense of what the company wants from them.

By developing an enterprise-wide focus on self-service BI, you can help your company streamline its processes. When the inevitable time comes that someone needs a quick answer in a meeting or to make a decision, you can relax knowing that your users will have access to the tools, data, and analysis required to do the job quickly and efficiently.


How to Transform Your Lazy BI Team

Are lazy BI practices getting the best of your team? It’s not uncommon for teams to become entrenched in their usual way of doing things, particularly when repeatable and seemingly mundane tasks are involved. There is always an opportunity for growth and process improvement in any team, but a case of lazy BI can make any new methods or change difficult to implement. 

If you’re looking to ignite change, don’t worry, best practices are always here to lend a hand with everything from data sources to server management. Whether you’re dealing with the self-taught BI wizard of the team or just a tired coworker, here are some strategies that can help. 

If You’re New to the Team

Before we jump into strategies, if you’re in the unique situation of being new to the team, there are a few things you should keep in mind. Though it might be easier for you, as an outsider, to see where improvements need to be made, it’s important to establish rapport with your teammates before rushing to make changes. 

Begin by observing and make note of potential adjustments to workflows or processes. Additionally, be inquisitive and ask questions to figure out the why behind methods that aren’t considered to be best practice. After you’ve allowed some time to get a feel for the entirety of the situation, consider these methods when developing your approach. 

Why Change if Nothing’s Broken?

Why should you do things differently if your current methods are getting the job done? Don’t be surprised if you encounter the ‘if it ain’t broke, don’t fix it’ mentality in response. This resistance to proposed change is common and natural, so the way you go about bringing people on board is essential.

Use the Laziness to Your Advantage

It’s not uncommon for new processes or best practices to be swept under the rug following their initial introduction. While new ways of doing things might be more efficient and a good idea on paper, no change can survive without successful adoption from the majority.

The key is to appeal to the prospect of less work and effort in the future. Even though the adjustment might create extra work for your team initially, it’s important to emphasize the time they will save down the road.

In this approach, you’ll be responding to the age-old question of “what’s in it for me?” 

Though best practices are better for productivity and the organization as a whole, how will these changes directly benefit those involved? Appealing to the desire at an individual level will increase your chances of successful implementation.

Start Small

The key to any kind of change is to start small. Upending the old methods to make way for new ones is extremely disruptive and can be overwhelming to most. Starting small will increase your chances of a successful adoption.

Find something small your team can achieve or begin to change. This is the same philosophy people use when altering their lives with something as simple as making their bed each day. Though it is a small task, it helps those involved feel accomplished, which makes them more likely to be productive elsewhere in their day and more open to greater change.

Conclusion

Overall, it’s important to remember that there is no ironclad rule or gold standard for the successful adoption of new methods. There is no sure-fire antidote or cure for a case of lazy BI. Regardless of which strategies or tactics you’re using to influence change, every team is different in the way it learns and adapts. Each scenario bears its own unique characteristics in terms of behavior, environment, and the topic of change itself. As a leader and a teammate, it’s up to you to assess these factors and strategize your approach accordingly.


7 Best Practices for Effective Data Management

One of the biggest challenges in data management is focusing on how you can make the most of your existing resources. A common solution tossed out as an answer is to implement best practices. What exactly does it take to turn that suggestion into action, though? Here are 7 of the best practices you can use to achieve more effective data management.

1. Know How to Put Quality First

Data quality is one of the lowest-hanging fruits in this field. If the quality of your data is held to high standards from the moment it is acquired, you’ll have less overhead invested in managing it. People won’t have to sort out problems, and they’ll be able to identify useful sources of data when they look into the data lake.

Quality standards can be enforced in a number of ways. First, data scientists should scrub all inbound data and make sure it’s properly formatted for later use. Second, redundant sources should be consolidated. You’ll also want to perform regular reviews of datasets to ensure quality control stays in play at all times.
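A minimal sketch of what that inbound scrubbing step might look like. The field names (`email`, `signup_date`) and the inbound date format are illustrative assumptions, not any particular system’s schema:

```python
from datetime import datetime

def scrub(records):
    """Normalize formats and consolidate duplicate inbound records."""
    seen, clean = set(), []
    for r in records:
        email = r["email"].strip().lower()   # normalize case and whitespace
        if email in seen:                    # consolidate redundant rows
            continue
        seen.add(email)
        # Coerce an assumed MM/DD/YYYY inbound date to a single ISO format
        date = datetime.strptime(r["signup_date"], "%m/%d/%Y").date().isoformat()
        clean.append({"email": email, "signup_date": date})
    return clean

rows = [
    {"email": "Ann@Example.com ", "signup_date": "01/15/2024"},
    {"email": "ann@example.com", "signup_date": "01/15/2024"},  # duplicate
]
print(scrub(rows))  # [{'email': 'ann@example.com', 'signup_date': '2024-01-15'}]
```

Real pipelines do far more (type checks, referential integrity, outlier screens), but every quality rule you enforce at intake is a problem nobody has to untangle downstream.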

2. Simplify Access

If it’s hard to navigate the system, you’re going to have data management issues. Restrictive policies should be reserved for datasets that deserve that type of treatment due to privacy or compliance concerns. Don’t employ blanket policies that compel users to be in constant contact with admins to get access to mundane datasets.

3. Configure a Robust and Resilient Backup and Recovery System

Nothing could be worse for your data management efforts than watching everything instantly disappear. To keep your data from disappearing into the ether, you need to have a robust solution in place. For example, it would be wise to use local systems for backups while also having automated uploads of files to the cloud.

Right down to the hardware you employ, you should care about resilience, too. If you’re not using RAID arrays on all local machines, including desktops and workstations, start making use of them. 

It’s also wise to have versioning software running. This ensures that backup files aren’t just there, but that they point you toward the versions of the files they correspond to. You don’t want to be using portions from version 2.5 of a project when you’re working on version 4.1.
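A minimal local-side sketch of the idea, assuming timestamped zip archives with simple retention. The cloud-upload half is left out, since it depends on your provider’s tooling:

```python
import shutil
import time
from pathlib import Path

def versioned_backup(src_dir: str, backup_dir: str, keep: int = 5) -> Path:
    """Archive src_dir into a timestamp-versioned zip and prune old copies."""
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    # make_archive appends ".zip" and returns the path it created
    archive = shutil.make_archive(
        str(Path(backup_dir) / f"backup-{stamp}"), "zip", src_dir
    )
    # Simple retention: names sort chronologically, keep the newest `keep`
    for old in sorted(Path(backup_dir).glob("backup-*.zip"))[:-keep]:
        old.unlink()
    return Path(archive)
```

The timestamp in the filename is a crude form of versioning; dedicated versioning or snapshot tools tie each backup to the project state far more precisely, which is exactly the 2.5-versus-4.1 problem described above.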

4. Security

Just as it’s important to have everything backed up, everything should also be secure. Monitor your networks to determine if systems are being probed. Likewise, set the monitoring software up to send you notifications for things like volume spikes and unusual activity. If an intrusion occurs, you want to be sent a warning that can’t be ignored even at 3 a.m.
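The volume-spike check can be sketched as a simple threshold rule. This is a toy illustration; production monitoring tools implement far more robust anomaly detection, but the principle is the same:

```python
from statistics import mean, stdev

def spike_alert(history, current, k=3.0):
    """Flag a reading that exceeds the baseline mean plus k standard deviations."""
    if len(history) < 2:
        return False  # not enough baseline data to judge
    return current > mean(history) + k * stdev(history)

baseline = [100, 110, 95, 105, 102, 98]  # requests/min on a normal day
print(spike_alert(baseline, 104))  # False: within the normal range
print(spike_alert(baseline, 400))  # True: probable incident, page someone
```

Wiring a check like this to an alerting channel is what turns passive logs into the 3 a.m. warning that can’t be ignored.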

5. Know When to Stop Expanding Efforts

Encouraging sprawl is one of the easiest traps you can fall into when it comes to data management. After all, we live in a world where there is just so much data begging to be analyzed. You can’t download it all. If you think something might be interesting for use down the road, try to save it in an idea file that includes things like URLs, licensing concerns, pricing, and details on who owns the data.

6. Think About Why You’re Using Certain Techniques

The best of operations frequently fail to adapt because they see that things are still working well enough. If the thesis for using a particular technique for analysis has changed, you should think about what comes next. Study industry news and feeds from experts to see if you’re missing big developments in the field. Conduct regular reviews to determine if there might be a more efficient or effective way to get the same job done.

7. Documentation

Someone someday is going to be looking at a file they’ve never seen before. Without sufficient accompanying documentation, they’re going to wonder what exactly the file is and the purpose behind it. Include the basic thoughts that drove you to acquire each dataset. Remember, the person someday looking at your work and wondering what’s going on with it might be you.



Ad hoc Analysis vs Canned Reports: Which One Should You Use?

If you’re a regular user of any type of data dashboard or analytics system, you’ve likely encountered a serious question about how to produce reports. Do you go with a canned report, or should you create ad-hoc analysis? Both approaches have their virtues, and your circumstances will often dictate which one you use. Let’s take a closer look at the question itself and the available options to make sure you make the right decision the next time the choice comes up.

What is the Difference?

Before getting too involved with this issue, it’s wise to clarify what we mean by canned versus ad hoc. A canned product is one that either:

  • Comes right out the box with your analytics program
  • Is based on a template someone at your organization has created or acquired

Generally, canned reports have limitations. In particular, you usually can’t squeeze more out of them than your BI dashboard allows you to. This can seriously limit customization.

Conversely, ad-hoc analysis is more of an off-the-cuff approach. This generally involves more labor and time because you have to put time into creating and formatting the report, even if your preferred dashboard provides you the necessary tools.

Pros and Cons of Ad Hoc Analysis

Time and labor are the biggest traps when trying to do anything on an ad-hoc basis. Without rails to guide you, the process can develop a bit of mission creep. As you fiddle with the knobs on the dashboard, you’ll find that time can get away from you. That can become a major problem, especially in situations where the first commandment of the job is simply to get the report done in a timely manner.

Ad hoc analysis has the virtue of specificity, though. Depending on the nature of the report, it can be helpful to take the time to develop deep dives on specific topics. This is, after all, one of the joys of living in the age of Big Data. Most dashboards are equipped for producing on-the-fly items, and you can generate some impressive results in surprisingly little time once you know where all the controls are located.

The learning curve, however, can be a challenge. If you’re dealing with team members who often resist taking initiative or who don’t pick up tech stuff easily, this can create a barrier. Sometimes, giving them a way to can their reports is just the most painless solution.

Pros and Cons of Canned Analysis

Far and away, the biggest pro of using a canned method is speed. In many cases, you only need to verify the accuracy of the numbers before you click an onscreen button. A report can be spit out in no time, making it very efficient.

One major downside of this approach is that people can tune out when they read canned reports. Especially if you’re putting a work product in front of the same folks every few weeks or months, they can flat-out go blind to the repetitive appearance of the reports.

A big upside, though, is that canned solutions reduce the risk of user errors. Particularly in a work environment where folks may not be savvy about tech or layout and design, it’s often best to have as close to a one-click solution as possible in place. This reduces the amount of technical support required to deliver reports, and it can help folks develop confidence in using the system. Oftentimes, people will begin to explore the options for creating ad-hoc analysis once they’ve had some success with the safer, more boring canned option.

In some cases, canned is the only option. For example, a company that has to produce reports for compliance purposes may have to conform to very specific guidelines for how the work product is formatted. It’s best not to stray under such circumstances, especially if your organization has a track record of generating such reports without issue.

The Fine Art of Choosing

As previously noted, your situation will often be the main driver of what choice you might make. If you’re working on a tough deadline, going the canned route has the benefit of making sure you can deliver a minimally acceptable product on time. There’s a good chance literally no one will be impressed with your efforts, but at least the report will be done.

Some topics deserve more attention than a canned product can afford. As long as you’re confident you have the required skills, you should consider putting them to work to do a deep dive in your report. This affords you the chance to tailor graphics and data tables to your audience. Especially when you’re first introducing folks to a particular topic or a unique dataset, this level of extra attention can be a difference-maker.

There is no perfect answer to the timeless question of canned versus ad hoc. Every situation has its parameters, and it’s prudent to be considerate of those requirements when you make your choice. With a bit of forethought about your approach, however, you can make sure that you’ll deliver a work product that will exceed the expectations of your target audience.



Why BI Projects Tend to Have a High Failure Rate – Ensuring Project Success

BI projects can begin with a simple goal but easily go astray. BI work often involves multiple moving parts and actors, and these projects can become complex, with many dependent pieces. Critical decisions made at the wrong level can throw the plan into chaos. Additionally, timelines are sometimes aggressive and don’t fully account for delays. There are many ways a project can fail, resulting in wasted money; below we will discuss the top three. Knowing these failure points is a major step toward ensuring BI project success.

Lack of Communication Between IT and Management Risks Your Project’s Success

Miscommunication can often jeopardize project success. It typically stems from ambiguous project requirements, which lead to poor business outcomes and erode trust in the workplace.

Some examples of miscommunication are…

  • The target isn’t made apparent
  • Insights from the data aren’t clearly stated
  • The users’ wants and expectations aren’t known

The blame usually falls upon the IT team because of their inability to communicate. Under 10% of IT leaders believe that they effectively communicate with non-IT colleagues.

Fault can also lie with management for not promoting effective communication, for instance by creating dedicated communication activities or demonstrating the importance of IT communication. Project metrics sometimes aren’t in place, so expectations go unstated. There is also an unrealistic expectation from management that there won’t be delays. Most people don’t work at full capacity every single day of a project, so the timeline needs to account for unexpected bumps in the road.

This lack of conversation holds teams back from their full potential. A feasible solution is to hire a communications director for the IT department. The department’s needs will be heard and conveyed to management, and vice versa, bridging the “lost in translation” gap between company officers and their IT team. If that option is off the table, secure more executive support for IT projects so there is guidance.

The Time to Value of BI Projects Is Unsatisfactory

The BI project failure rate is upwards of 80%, according to Gartner. Unclear business targets leave analysts with questions like…

  • What do I present to my audience?
  • Are we using the most desirable data?
  • What data should I cultivate for this project?

Along with answering those questions, data quality issues can emerge. Dirty data can cause BI teams to stagnate because of the amount of data prep necessary. Time-to-value increases when business goals and the data aren’t aligned. Checkpoints on project progression and what needs to be accomplished should happen quarterly. Implementing or scaling a solution can be a long process that causes users to grow impatient, whether from the inability to quickly learn and adopt new BI tools or from a lack of qualified personnel. Once a product or update is on the market, the cost-benefit analysis comes in. Most BI projects take a while to show the intended returns, which can lead to discouragement.

If requirements weren’t formally stated, the final product can become a big flop. All the time spent on the project won’t have much value.

Data Issues

Some questions to keep in mind when working on BI projects are:

  • Where does the data come from?
  • What is the validity?
  • Do the data sets make logical sense to analyze?

The data needs to be sanitized and filtered in order to achieve business objectives. Companies collect massive amounts of data and have numerous ways to analyze it. Focusing on the target simplifies this process. Without a fixation on the target goal, “data paralysis” can occur and the objective can be lost. Data paralysis is over-analyzing a situation to the point where no decisions are made, causing all forward movement to halt.

By finding a way to harness the specific data relevant to the business need, insights can be drawn. Key points are highlighted and then presented through data visualization. When presenting your insights, focus on the audience and what they need to know.

Call to Action

Although there are many ways for a BI project to be unsuccessful, lack of communication, time-to-value issues, and data issues are the top causes. One way to solve these problems is to hire a project manager with a background in BI. That individual can communicate with the IT department and executives to set a goal attainable by both parties. The project manager can also adjust the timeline, allowing the IT department adequate time to complete the project and reap the rewards. 

Ensuring effective communication makes it easier to set quality expectations and goals. Avoiding data quality issues and slowdowns will sidestep schedule delays. The timeline for a BI project should be flexible; issues are bound to happen. Human error is more likely to arise than technology error, so correcting your team’s actions will save project time and money.

Read more similar content here.


What is Ad Hoc Reporting? – Ad Hoc Reporting for Tableau, PowerBI, Excel

What is Ad-hoc Reporting?

If you heard someone using the term “ad hoc reporting” for the first time, you might think they’re using another language, or are at least missing a few letters.  Well, that’s partly true.

Ad-hoc is Latin for “as the occasion requires.”  When you see “ad-hoc,” think “on-the-fly”. Ad-hoc reporting is a model of business intelligence (BI) in which reports are created and shared on-the-fly, usually by nontechnical business intelligence users. These reports are often done with a single specific purpose in mind, such as to provide data for an upcoming meeting, or to answer a specific question.

Under the ad-hoc model, users can use their reporting and analysis solution to answer their business questions “as the occasion requires,” without having to request help from a technology specialist. A key feature of ad-hoc reporting is that it enables, and embodies, self-service BI in most enterprises. Ad-hoc reports can be as simple as a one-page data table or as complex and rich as interactive tabular or cross-tab reports with drill-down and visualization features, or present themselves in the form of dashboards, heat maps, or other more advanced forms.
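To see what a cross-tab actually is under the hood, here is a minimal sketch in plain Python with invented data: it is simply an aggregation pivoted across two dimensions, one on the rows and one on the columns. BI tools build exactly this structure for you via drag-and-drop.

```python
from collections import defaultdict

# Invented fact rows: (region, quarter, sales)
facts = [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 200), ("West", "Q2", 50),
    ("East", "Q1", 30),
]

# Pivot: rows = region, columns = quarter, values = summed sales
crosstab = defaultdict(lambda: defaultdict(int))
for region, quarter, sales in facts:
    crosstab[region][quarter] += sales

for region in sorted(crosstab):
    print(region, dict(crosstab[region]))
```

Drill-down in a BI tool is the same idea applied recursively: clicking a cell re-runs the aggregation at a finer grain (say, region → city) over just the rows behind that cell.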

With ad-hoc reports, all the technical user does is set up the BI solution, ensure the data is loaded and available, set security parameters and give the users their account logins. From that point on, the actual reports are created by business end-users.

Ad hoc reporting stands in contrast with managed reporting, where the technical user–the report developer–creates and distributes the report. As you may have guessed already, if your BI tool of choice supports ad-hoc reports, it will be a big time saver for your technical report developers.

Who Uses These Types of Reports?

This depends in large part on a) the type of ad-hoc solution employed, b) the needs of the end-user and c) the user’s confidence with the solution.

The most common creators of ad-hoc reports are business users and departmental data analysts. In some BI shops, ad-hoc reporting access can be shared outside the organization with business partners and outside auditors, who may need secure access to this information.

What is Ad-Hoc Reporting and Analysis Used For?

Ad hoc analysis is performed by business users on an as-needed basis to address data analysis needs not met by the business’s established, recurring reporting that is already being produced on a daily, weekly, monthly or yearly basis. The benefits of self-service BI conducted by ad hoc analysis tools include:

  • More current data: Ad hoc analysis may enable users to get up-to-the-minute insights into data not yet analyzed by a scheduled report.
  • New reports produced in record time: Since these reports may be single-use, you want to produce them as inexpensively as possible. Ad-hoc report features in a BI tool allow users to sidestep the lengthy process that can go into a normal report, including design work, development, and testing.
  • Line-of-business decisions can be made faster: Allowing users — typically, managers or executives — access to data through a point-and-click interface eliminates the need to request data and analysis from another group within the company. This capacity enables quicker response times when a business question comes up, which, in turn, should help users respond to issues and make business decisions faster.
  • IT workload reduction: Since ad hoc reporting enables users to run their own queries, IT teams field fewer requests to create reports and can focus on other tasks.

Although most ad hoc reports and analyses are meant to be run only once, in practice, they often end up being reused and run on a regular basis. This can lead to unnecessary reporting processes that affect high-volume reporting periods. Reports should be reviewed periodically for efficiencies to determine whether they continue to serve a useful business purpose.

The Goal of Ad-hoc Report Creation

Ad hoc reporting’s goal is to empower end-users to ask their own questions of company data, without burdening IT with the task of creating a myriad of reports to serve different functions and purposes. Ad-hoc reporting therefore makes the most sense when a large number of end-users need to see, understand, and act on data more or less independently, while still being on the same page as far as which set of numbers they look at.

For example, a company with a large outside sales force would be a perfect fit for ad-hoc reporting. Each sales rep can set up their own report for their territory, showing performance against sales goals, orders taken, number of visits to each client, etc., in a format that makes the most sense to them. And just as importantly, the numbers used are pulled from the same data sources as the rest of the company, thereby promoting consistency and minimizing surprises at the end of the quarter.
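The territory report above can be sketched in a few lines of plain Python. The rep names, order amounts, and goals here are invented for illustration; in practice the rep would assemble the same summary by pointing the ad-hoc tool at the company’s shared order data.

```python
# Hypothetical per-rep order data and sales goals
orders = [("Ana", 1200), ("Ana", 800), ("Ben", 500)]
goals = {"Ana": 1500, "Ben": 1000}

# Aggregate orders per rep
report = {}
for rep, amount in orders:
    report[rep] = report.get(rep, 0) + amount

# Performance against goal
for rep, total in sorted(report.items()):
    pct = 100 * total / goals[rep]
    print(f"{rep}: {total} of {goals[rep]} ({pct:.0f}% of goal)")
```

The point of the example is the last line of the article’s paragraph: because every rep’s report aggregates the same `orders` source, two reps can format their reports differently and still agree on the totals.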

Top 3 Benefits of Ad Hoc Reporting

1. Empowering End-Users to Build Their Own Reports Saves Money and Time

In a study of over 100 analytics managers, 50% of the team’s time was spent working on ad hoc reporting requests rather than creating new dashboards and analysis. Since the vast majority of ad hoc reporting is single-use and then discarded, that is a major waste of valuable analyst time. Why are the analysts doing this? Usually because they’re the only ones in the company with the skills to do so. But this is a huge misuse of resources: that expensive analyst time should be spent building reusable analyses that can benefit a large population of users. Putting that ability into the hands of all users, with simple ad hoc reporting tools, accomplishes three key things: 1) it frees up expensive analysts and keeps them from performing unchallenging tasks; 2) it makes end users feel empowered and self-sufficient; 3) it saves the time a business user would otherwise spend explaining their requirements to an analyst and lets them get straight to work.

2. Encouraging Users to Explore Data Increases Data Discovery

Intuitive ad hoc reporting stimulates information sharing among departments. It enables trend recognition, along with any potentially relevant correlations, based on the immediate availability of accurate, up-to-date data. By increasing the chance for discovery, you increase the chances of finding things like inconsistencies, new revenue markets, and more. Giving end users flexible, easy-to-use ad hoc reporting tools makes them feel empowered and lets them be more hands-on with their data. Which would you trust more: a report done by someone in another division, in a different time zone, or something you put together and tested yourself?

3. Ad Hoc Reporting Streamlines Decision-Making

Of all the things ad hoc reporting can be, at its core, it’s a lever to improve decision-making. Reports give a snapshot of the current state of the business through a specific lens – sales, marketing, performance, and other measures. Organized into a sharable and easy-to-read format, all employees can have the same resources and knowledge necessary for swift action. It makes data analysis a team effort.

Benefits of a Web-based Solution

Get critical information to the right people at the right time – Self-service results plus automatic scheduling and delivery of information facilitate timely decision making. Users get the information they need, when they need it, to answer critical, real-time questions.

Flexibility for constantly changing environments – Business needs evolve, and answers to changing business questions become more critical. It’s impossible to predict what questions and answers users may need in the future.

Saves training costs and time – Streamlines users’ access to critical information. Easy-to-use wizards let users get up and running quickly, reducing the time needed to learn the application and providing clear guidance while building reports.

Encourages collaboration and information sharing – Users can easily create, organize, publish and make reports available to other users via the Web for on-demand viewing.

Reduces IT workload – The Web-based reporting application itself can be deployed quickly for widespread availability to end-users. Once deployed, it empowers users to build the reports themselves anytime they need the information. No waiting for IT report developers to build them.

What to Look For in a Good Ad-hoc Report Solution

Now that you have an understanding of what ad-hoc reports are, a good reporting solution should check all of the specific boxes in your feature list. It should be intuitive and easy to use by both business users and technologists. It should be broadly accessible with a light footprint so that many people can access it. It should be able to deliver the answers to users’ questions quickly and cleanly. In short, it should be oriented toward self-service BI and should be lightweight, fast, and easy to use.

A good ad hoc reporting solution will offer the following characteristics:

Easy to use. If it is or even appears to be complicated, many end-users will be turned off and user adoption will suffer. For this reason, some of the better ad-hoc solutions available today offer a basic set of intuitive features that are wizard-driven and will look easy even to the proverbial “non-computer person,” while also offering more advanced sets of tools for the user who feels confident.

Robust. Assuming that adoption is not an issue (see the previous point), the ad-hoc solution should offer end-users what they need to see, understand and act upon their data. Far from being a more hi-tech version of Excel, it should offer interactive features like ad-hoc dashboards, drill-down and drill-through, advanced sorting and filtering, rich visualization tools like heat maps, charts and graphs, etc.

Widely accessible. For it to be truly useful, a BI solution (including ad-hoc reporting) should be web-delivered and accessible with a browser. Apart from offering familiar navigability and security, a Web-based solution is available from virtually anywhere and on any device with an Internet connection. Another benefit of a Web-based ad-hoc solution is that the system administrator won’t have to set it up individually on every user’s machine: installing it on the server is enough, and all the users need to access it is a simple URL.

Today’s better Web-based ad-hoc solutions are data-source neutral, meaning that they can connect practically out of the box to most of today’s commonly-used data-sources, including databases, Web-services, flat files, etc. This saves the IT department the burden of creating complex metadata structures as the underlying layer, which is time-consuming, cumbersome and expensive.

If you’re a regular user of any type of data dashboard or analytics system, you’ve likely encountered a serious question about how to produce reports. Do you go with a canned report, or should you create ad-hoc reports? Both approaches have their virtues, and your circumstances will often dictate which one you use. Let’s take a closer look at the question itself and the available options to make sure you make the right decision the next time the choice comes up.

What is the Difference?

Before getting too involved with this issue, it’s wise to clarify what we mean by canned versus ad hoc reporting. A canned product is one that either:

  • Comes right out of the box with your analytics program
  • Is based on a template someone at your organization has created or acquired

Generally, canned reports have limitations. In particular, you usually can’t squeeze more out of them than your BI dashboard allows you to. This can seriously limit customization.

Conversely, ad-hoc reporting is more of an off-the-cuff approach. This generally involves more labor and time because you have to put time into creating and formatting the report, even if your preferred dashboard provides you the necessary tools.

Pros and Cons of Ad Hoc Analysis

Time and labor are the biggest traps when trying to do anything on an ad-hoc basis. Without rails to guide you, the process you use can develop mission creep. That can become a major problem, especially in situations where the first commandment of the job is to just get the report done in a timely manner.

Ad hoc analysis has the virtue of specificity, though. Depending on the nature of the report, it can be helpful to take the time to develop deep dives on specific topics. This is, after all, one of the joys of living in the age of Big Data. Most dashboards are equipped for producing on-the-fly items, and you can generate some impressive results in surprisingly little time once you know where all the controls are located.

The learning curve, however, can be a challenge. If you’re dealing with team members who often resist taking initiative or who don’t pick up tech stuff easily, this can create a barrier. Sometimes, giving them a way to can their reports is just the most painless solution.

Pros and Cons of Canned Analysis

Far and away, the biggest pro of using a canned method is speed and repeatability. In many cases, you only need to verify the accuracy of the numbers before you click an onscreen button. A report can be spit out in no time, making it very efficient.

One major downside of this approach is that people can tune out when they read canned reports. Especially if you’re putting a work product in front of the same folks every few weeks or months, they can flat-out go blind to the repetitive appearance of the reports.

A big upside, though, is that canned solutions reduce the risk of user errors. Particularly in a work environment where folks may not be savvy about tech or layout and design, it’s often best to have as close to a one-click solution in place. This reduces the amount of technical support required to deliver reports, and it can help folks develop confidence in using the system. Oftentimes, people will begin to explore the options for creating ad-hoc analysis once they’ve had some success with the safer and more boring canned option.

In some cases, canned is the only option. For example, a company that has to produce reports for compliance purposes may have to conform to very specific guidelines for how the work product is formatted. It’s best not to stray under such circumstances, especially if your organization has a track record of generating such reports without issue.

The Fine Art of Choosing

As previously noted, your situation will often be the main driver of what choice you might make. If you’re working on a tough deadline, going the canned route has the benefit of making sure you can deliver a minimally acceptable product on time. There’s a good chance literally no one will be impressed with your efforts, but at least the report will be done.

Some topics deserve more attention than a canned product can afford. As long as you’re confident you have the required skills, you should consider putting them to work to do a deep dive in your report. This affords you the chance to tailor graphics and data tables to your audience. Especially when you’re first introducing folks to a particular topic or a unique dataset, this level of extra attention can be a difference-maker.

There is no perfect answer to the timeless question of canned versus ad hoc. Every situation has its parameters, and it’s prudent to be considerate of those requirements when you make your choice. With a bit of forethought about your approach, however, you can make sure that you’ll deliver a work product that will exceed the expectations of your target audience.

