We built a BI system to give clients access to the data that we collect. They can view the data and various reports through Power BI.
It is built into the Azure cloud. You can't deploy it otherwise.
We have a product called ED Tracker that clients can subscribe to and use through Power BI. It enables us to offer new services and lets clients work on the data and reports themselves, rather than our sending them data in PowerPoint decks, PDF reports, and so on. So, we work with our clients through this platform. If they want to access the system, they just need to get a license. The license is very cheap, at $10 a month per user, and once they have it, they can access our cloud solution.
Its connectivity with other Office applications, mostly Excel, and the ease of deployment are its most valuable features. It comes bundled with the cloud, so you don't need to set up a server or standalone infrastructure. Getting into the system, or building something that you can deploy, is very easy and very cheap. With other systems, you need a server and a license for the server, so the initial setup is very costly.
It is an evolving solution, so it still has some rough edges. Compared to Tableau or QlikView, there are some things you can't do when you want to, such as giving specific users access to specific reports. You can get it up and running very fast, but some things are a bit trickier, and for some of them, you need to actually write code.
It is a work in progress. They're catching up on the competition, but that takes time. Other solutions are more mature and have been in the market much longer, but Power BI is catching up, and it has come a long way from how it worked two years ago. Still, there are things you can't do with it. For example, permission management and user access management are still a bit limited. The model basically assumes that everybody in the organization can see everything, with only coarse limits on the type of data they can see. If I want one user to see only one report and another user to see a different report, I can't do it. There should be a better way to manage permissions and users, and it should also support external users much better.
There should be the ability to export to PowerPoint or PDF, and it should be more efficient; it's rather clunky right now. Sometimes, the system is inconsistent in the way it does things.
I have been using this solution for two years.
It is adequate in terms of speed and stability. It is very stable. Sometimes, it is a bit slow. It can be faster, but you need to subscribe and purchase additional packages or resources, and then it becomes more expensive.
We haven't scaled yet, but you have the ability to have a dedicated server on Azure with more CPU. You can scale up and add a SQL Server, so it can scale.
As of now, we have around 10 to 12 users, some internal and some external clients. We do have plans to increase usage because we're trying to sell and market the product to other clients as well. One of the benefits is that it doesn't matter whether we have 10, 20, or 50 users; it doesn't impose any costs on us, because they pay the cloud provider directly rather than us. We do plan to extend the use of the system, and we might also extend it internally.
Their technical support is absolutely magnificent. A week ago, we had an issue related to permissions, and we couldn't figure out how to do what we wanted. My colleague contacted Power BI support. They not only answered us by mail; they also had a half-hour session with us on Teams to better understand our issues. They asked us to send them the files, reviewed them, and told us that there were still some limitations, but that they were working on them and would let us know.
We were stunned that someone from Microsoft was interested in what we're doing and was willing to go online for a half-hour session so that we could explain what we're doing and what our issue was, and they could think about how to resolve it. We're a small client, not a big company, so we were stunned by their support. Their support is amazing.
A few years ago, we tested QlikView and Qlik Sense. Their deployment costs were rather high, so we decided to use Power BI.
It was very easy and straightforward, and rather quick. It is very easy to publish, and they give you direct access to their cloud. For small solutions or datasets like ours, the initial setup was a matter of days. We started with the desktop on-premises and then published to the cloud. It was rather easy; it took a matter of days to a week or two.
We used our own team. Its deployment and maintenance are taken care of by a PM and a colleague of mine. It is very easy. You just press publish, and it's off to the cloud.
In terms of ROI, it is a 10 out of 10.
Its price is very low. It is like $10 per user, per month. The clients pay for their own licenses. It is not on us.
There are no costs in addition to the standard licensing fees. That's the beauty. With other systems, you need to spend a couple of thousand dollars just to get started, and then you need to spend $500 per year for the license, which becomes much more costly. You have a system here where for $120 to $140 a year, you can start with two people and start developing and deploying. You can see why the cost difference is huge, especially when you are on a low scale, like us, and you're not building something very huge.
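To make the comparison concrete, the licensing arithmetic above can be sketched as a minimal illustration. The function names are mine; note that at the quoted $10 per user per month, each user costs $120 a year, so a two-person start comes to roughly $240, still far below a multi-thousand-dollar server setup.

```python
# Hedged sketch of the annual licensing arithmetic from the review.
# Figures come from the text: $10 per user per month for Power BI,
# versus thousands of dollars up front plus ~$500/year elsewhere.

def per_user_annual_cost(monthly_fee, months=12):
    """Annual subscription cost for a single user."""
    return monthly_fee * months

def team_annual_cost(monthly_fee, users):
    """Annual subscription cost for a small team, with no server fee."""
    return per_user_annual_cost(monthly_fee) * users

# Two users at $10/month each come to $240/year.
print(team_annual_cost(10, 2))  # → 240
```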
We didn't evaluate other options because we have had some past experience with other solutions. We knew that QlikView might be good, but you need to spend a couple of thousand dollars just to get started if you want to do something. We knew the costs, and the entry cost was much higher. So, we decided to go with Power BI. It is also integrated with Office and Excel, so it's very easy to go along and do some of the things that you can do in Excel. It is very easy to transition between them.
If you are looking for a good BI solution for a small business that is very easy to deploy and not costly and that can use the cloud in terms of security, Power BI is probably the best solution in the market.
I would rate it a nine out of 10. There are other solutions that might be better than this, but they're more costly. It is the cheapest BI solution in the market. It is not the best in terms of features, but it is the best in terms of value for money. For the volume of work that we have, there is absolutely no competition.
We are experts in process analysis for the biggest companies in Poland.
Software AG is trying to sell us on their cloud version, but for now, we are still using the on-premises version on our server.
MashZone is a very simple system for analysis because you connect widgets for object analysis. You do not have to develop anything in SQL or another programming language. Anyone who knows something like Microsoft Excel will have enough experience to try doing analysis with MashZone. It is that simple, and it is easier to use than Qlik, Tableau, et cetera.
The ease of use of MashZone is better than any of the competition for data analysis.
Within the last month, Software AG sent us a new version of MashZone. Some features have been modified, and there are two new functionalities that are quite interesting to us. But we are still in the testing stage, so there is not much to say yet about the new version or about whether it resolves some of the things that are currently issues for us.
The interface is good. It is very simple. It may not have many functions, but it is enough for process analysis. By comparison, Qlik and Tableau have many, many more functions. Statistical analysis is impossible in MashZone, but for process analysis, it is enough.
MashZone is the best for process analysis. It is a very solid system. It is not as good for BI (Business Intelligence) because when you have big data, you have a problem. When we start to aggregate data and we have a source of something like one million rows, we have a big problem with MashZone because it is a very slow system.
I have worked with MashZone for about two years, since I started at my present company. The company has been working with Software AG (Aktiengesellschaft / Corporation) for many, many years, but my own experience with the product is from the last two years only.
I think the stability depends on the specialists you have in your company. It is a very stable system when you have good employees in your company who know what they are doing and how to work with the product. If the IT department that you have is good, then MashZone is very stable.
The number of users that we have changes a lot, up and down, because of periodic fluctuations. We are a company which sells the systems to many big companies. In the last half-year, I worked for the Polish government postal services, Poczta Polska. They have about 1,000 users. But now, in our system, we have only about 10 users. The product can scale to the number of users.
The technical support is a problem because the tech support team in Germany is the only group with enough experience. In Poland, we have had many situations where we have had a problem that we needed to address for our customers. Usually, we had to talk with the technical support in Germany to get the issue resolved because the depth of knowledge in Poland is a problem. If you get technical support in Germany they have a very good understanding of the product and the support is excellent.
We sell Software AG solutions and when we talk about process analysis, I think Software AG has the best solution. ARIS (Architecture of Integrated Information Systems) products like Architect, Connect, and Process Mining are very good solutions for process analysis which I have used before.
We are using only Software AG for our process analysis at my current company. But sometimes a customer will ask about some solution for statistical analysis or for financial analysis, et cetera. When they need this type of analysis we might use another product which we research or which they request.
It is quite difficult to say how the installation is for common users. This is because we have high competency in our company and many people who know the product well. When I came to our company, I worked with my colleague at the next desk and he knows every answer to every question without doing any research. For people with experience, this product will be easier to install and use.
Because we sometimes get data from systems and companies that need different kinds of analysis, we need the best application for each data type. We thought the best solution for statistical analysis might be Qlik, but I could not be sure without comparing, so we did some testing. Qlik has in-memory technology that behaves differently, and we saw a big difference using Qlik for this type of analysis compared to MashZone.
The advice I have for people considering MashZone is that it is very good for specific things. When you have big data you must buy MashZone with Terracotta, because Terracotta is a second Software AG solution with better memory technology. Only using MashZone along with Terracotta will make the processing fast enough.
I think the biggest problem with MashZone is that even though you have a lot of widgets and the system is simple to develop with, it is not a very fast system. It is a very simple system to use, more simple than Tableau in my opinion. But the biggest problem is speed. It is not fast with processing.
Because the system is really designed for process analysis, when we analyze a process for our customers, MashZone is much better than all other solutions. But the database for processes cannot be very big. If there are about 40,000 rows for process analysis, which is a pretty big analysis, then MashZone is enough. But when you want to analyze something statistically or do financial analysis, MashZone is not a very good system. That is not what it was designed to do and excel at.
On a scale from one to ten (where one is the worst and ten is the best), I would rate MashZone NextGeneration overall as a seven out of ten.
The reason it gets a seven is the problems the product has processing big data. If I try to analyze big data in MashZone, it is just impossible. I changed the server for a better one, and it was still not enough processing power. This is only my opinion, based on the version of the software that we are using (10.3). They now have a new version (10.5) with some new features that we are very interested in, and maybe I will score the product higher than seven when I have the opportunity to use them. For example, in the new version, Software AG claims there is now a business component for developing our own templates for MashZone.
The second problem that Software AG says they have tried to address is the speed of MashZone. Prior to this release, MashZone did not have enough capability to do proper Gantt analysis. Now Software AG says that Gantt analysis is better in the new version. We do not know for certain because we have not yet put it to the test. For now, while we are using version 10.3, it is practically impossible. Of course, there is something to do this type of analysis in version 10.3, but it is very complicated to use.
As a service company, we were using Spotfire for things like timesheet analysis for our own purposes. What was more important was that we were advising our corporate customers, chemical, oil and gas companies like Shell, Total, and Exxon about where they should use this tool and how to develop an application using it. I'm really on the service side and Spotfire was a tool that I could offer to my customers for delivering projects.
I championed the use of this solution in my company (an engineering service company) and ultimately we delivered a lot of projects. Often, we were working on development, more specifically user interface development, graphical user interface, and other things that were extremely costly and time consuming. When the BI tools arrived, it seemed to be a very fast way to not only analyze data, but also provide interactive dashboards to people, which before would've required the development of a custom tool. This would've been magnitudes higher in terms of price, so I really saw an opportunity there with the BI tools.
At this stage, it was clear that Spotfire was the front-runner. I don't have much experience with Tableau, but clearly, for scientific applications, Spotfire was awesome, especially because of its integration with things like RNG. As an engineer, I was very excited.
To be completely frank, the main problem of Spotfire is that it's being destroyed by Power BI. That's the only problem. Otherwise, the product is superior from a technical perspective, but it is the victim of an extremely aggressive strategy from Microsoft and has therefore become far too expensive by comparison, because Power BI is effectively free in organizations. That's Spotfire's biggest weakness.
Another thing is that the real-time dashboarding capabilities and the integration with other real-time streaming products should be much easier. The handling of real-time data could be improved.
I would also improve the consumption of real-time data, the integration with the RNG, and, generally speaking, the data science techniques. I think this is where Spotfire can still play a role and be competitive compared to Power BI. Other than that, I love the product, but I don't know how they can survive Microsoft's commercial offensive.
I have been using this solution for around six years.
I've had projects running for a while and have no particular complaints about stability. Spotfire was originally an offline analysis tool, so stability isn't a huge issue there. It's much more of an issue when you do real-time statistical treatment of real-time data. That's where, as I mentioned, they have to improve.
TIBCO is a huge company, and they have this policy of acquiring software all the time, which is an interesting yet aggressive approach to development. The problem when you keep acquiring companies is that at some point, you have to integrate the products. That's where things tend to take a lot of time. The integration can sometimes be wishy-washy afterwards, and I think this is what happened in the real-time space.
Spotfire has identified that there was something in the market that was asking for the consumption of realtime data and the provision of realtime dashboards and analysis. What they've done is half integrate another product that they bought and I found this strategy weak. I think this is where the stability will really become a critical factor, but overall, I would say so far so good.
I think scalability is pretty good. I've seen customers running thousands of reports. For reaching good scalability, it also depends on your network architecture and whether you host it on cloud or not. I would say there's nothing in the software that really worries me, but you can always mess it up.
Overall it's quite good. As a partner, there's a big difference because my request for assistance is usually prioritized over regular customers. I probably had access to the hotline, but it's very clear that when you're a partner and you meet the commercial team, they know that you're pre-sale, and so you get a lot better answers from them than you do from the hotline. Overall, I have nothing to complain about, but I'm not blown away either.
It depends how you deploy it and the use case. If you just want to install it on your computer and get going for self-service usage, that's a matter of five minutes, so that's extremely easy. If you want to deploy it at the corporate level with the web server, that can be more complex. For a corporate analysis solution or corporate dashboarding solution, it's more complex, but also not unexpected.
When you've installed or deployed the software in your company, you're still nowhere; that's generally the problem with leaving it to the IT department. Is the product installed? Is someone doing something with it? Is there the necessary skillset around it? Are your engineering and technical personnel able to operate it on their own? Those questions seem to be the least of their worries, but this is why you pay for the software.
I would say that in terms of a deployment strategy, everyone will do the ITPs. If you have an IT person in your company that knows half of what they're doing, there's no problem. You manage.
Nevertheless, when putting a correct strategy around the use of business intelligence tools in the organization, it's relevant to ask: What will we use it for? What kind of training will we provide? What kind of algorithms will we develop on top of that? I mentioned the integration with the RNG, for example, which is extremely powerful. These are big questions that people often forget.
They think that just by buying a data science or dashboarding tool, results will come out of it miraculously, but no results come out of the software; results come from people using the software. So the challenge, in terms of deployment strategy and the time it takes, is entirely a function of your level of ambition. We can deploy it in one night, but then it has no impact on the company. If you really want to make an impact, you are looking at deployment activities that are much longer. In particular, after the deployment, you should probably do something to maintain the life cycle of the product. So I would say an ideal deployment is a deployment that never ends.
To summarize, from a technical point of view, deployment should be the least of your problems; it can be easy. Nevertheless, building a correct strategy around the use of business intelligence is something different. I believe that all decision makers should really focus their attention on this and not on something as silly as whether the solution is easy to install.
I would rate this solution an eight or nine out of ten.
It's one of the best products I've worked with in my career, especially in the engineering, oil and gas, and chemistry fields. A long time ago, you found a lot of niche software players with terrible software. Being able to introduce something modern like Spotfire was a real breath of fresh air for us. It's an excellent product, and the ability to customize it is really good.
Particularly for us, because as a service company, we tend to do things that are a bit more advanced than what the production people do, and therefore, I was very pleased with it.
I collated all the reports that we got from Domo's APIs, then performed some ETL and processing so we could build a final output that powered the dashboard. Then, we created all types of things in Domo. At that point, the license let us use all the available jobs in Domo, so we were using tables and pie charts. For demographics, we used the geographical charts for Australia and the USA, as the brands we deal with are mainly from Australia and the USA.
31 million rows of data are getting processed every hour within Domo.
Domo has their own internal servers and phone apps.
I was using Domo comprehensively and exclusively in my previous organization. In this organization, the visualization has been improved. There were glitches when you went from one page to another, but that lag has been corrected.
The basic levels of Domo were not made for developers. It was made for anyone who is coming from a nontechnical background. They can utilize Domo on the fly, e.g., if you have data and want to see a type of visualization on the fly, then you can use Domo to quickly examine your data.
Domo is a comprehensive tool for ETL, visualization, and the media features that we use for direct connections to all the digital marketing platforms. For the database, we had two to three types of ETL that we could use. Its comprehensiveness was a major factor for us.
The API systems are very good. They were an attractive feature of Domo at the time of purchase.
The new insights feature that they provide alongside charts is pretty amazing. Even if you don't want to build a dashboard, they create a quick dashboard based on the data you upload. You don't have to write a single piece of code; you just upload your data, and then you can use all of the visualizations. It is a new feature that I really like. A person who doesn't know much about programming or SQL can see his numbers in graphs, pie charts, and bar charts.
Domo is not a difficult tool to learn. All you need to know is the SQL for the ETL part. You don't need to write much code. That's the great part. It uses legacy languages, like SQL, which is very common among developers who then don't have to go and learn Domo's own syntax. Therefore, you don't have to learn another hard language to use Domo.
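As an illustration of the point above, the SQL needed for a typical ETL step is ordinary aggregation, nothing exotic. The sketch below uses standard SQL run through SQLite, not Domo-specific syntax, and the table and data are made up:

```python
# Illustrative only: the flavor of SQL an ETL transform typically uses
# (standard SQL via SQLite here, not Domo's own tooling).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ad_spend (platform TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO ad_spend VALUES (?, ?)",
    [("facebook", 100.0), ("google", 250.0), ("facebook", 50.0)],
)

# A typical transform: roll raw rows up into per-platform totals.
rows = conn.execute(
    "SELECT platform, SUM(cost) FROM ad_spend GROUP BY platform ORDER BY platform"
).fetchall()
print(rows)  # → [('facebook', 150.0), ('google', 250.0)]
```

Anyone comfortable with GROUP BY and JOIN in plain SQL already has the skills the reviewer describes.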
The way ETL workflows are stored is not up to the mark. You have to rely on the naming convention that you're using in Domo because there is no folder system where you can collate all your workflows and put them into separate folders. A folder system should be there so you can easily see how you are working; when you want to make changes to your ETL, you can then see the whole lineage and identify what is there and what is not. I felt that this could be drastically improved.
The utilization part: We cannot play much with the UX/UI.
While they have APIs, they kept failing if the data volume was too large. There was a 10 to 20 percent chance that a call would fail. I don't know what improvements they have made in the year since I used it, but previously the failures were quite consistent in the API stuff. I would like to see them work on that.
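The review doesn't say how these intermittent failures were handled. A common client-side mitigation, sketched here as a generic pattern and not as anything Domo provides, is to retry transient errors with exponential backoff; the `send` callable is hypothetical:

```python
# Generic retry-with-backoff sketch for a flaky upload API.
# `send` stands in for any network call that can fail transiently.
import random
import time

def send_with_retry(send, payload, attempts=4, base_delay=0.1):
    """Call send(payload), retrying transient errors with backoff."""
    for attempt in range(attempts):
        try:
            return send(payload)
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; surface the failure
            # Exponential backoff with a little jitter.
            time.sleep(base_delay * (2 ** attempt) + random.random() * 0.01)
```

With a 10 to 20 percent per-call failure rate, even three retries drive the chance of an overall failure well below one percent, assuming failures are independent.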
When you are looking at a full-fledged product, you want pretty dashboards or storyboarding. In these cases, you cannot use Domo. That's the drawback. It's exclusively for exploratory data analysis (EDA).
I worked in Domo from 2018 to late 2019, and I am now going through some migrations from Domo to another tool. Before that, for about a year and a half, I was developing the ETL and visualizations and setting up connections to the API data in Domo to extract all the digital marketing data. Mainly, I was working in the digital marketing domain, with Facebook, Amazon, and Google ads, which were heavily used as KPIs in my organization. Right now, I am in touch with the tool for the visualization and ETL parts, but not for the API connections.
The product is quite stable. From my point of view, it's quite a good tool to use if you need all types of analysis. Stability-wise, it's doing well. I don't see any lag or other glitches apart from ones that I mentioned for improvement.
Not many people are needed for the maintenance of this solution. Management of Domo is very easy. Apart from developer access, we can keep it to limited people. Normal users looking at visualization are given read-only access. Therefore, in terms of access, you can define the roles of the users. That's easy to manage.
They are doing well with scalability where other companies are struggling with it. Domo is providing a cool feature that other companies struggle to work on, which is something amazing to see. Innovation-wise, Domo is doing well.
In my organization, there are four users who use the Domo license. Two of them are managers, another is the group head, and the fourth is an analyst.
The technical support is very responsive. They are ready to reply, always having a solution ready. They are good at their work and what they do.
We previously used Tableau. We shifted to Domo because Tableau was getting expensive and the features that we get in Domo are what you get in Tableau.
The deployment was very easy. You don't have to buy your own server. These visualizations are nice because they have their own structure to handle these things, which is a good feature.
There was another company who was entrusted with Domo's setup.
My previous organization is still using Domo and are happy with what Domo is giving them.
The price that they offered was around $200 per user license. It was pretty cheap at that time compared to other companies. I think they have revamped their pricing structure since then.
Our company purchased a private license for approximately 20 users.
With Domo's competitors, you have to go in separately and buy your own server. The drawback with Domo is that it doesn't let us work on the UX/UI much because of the layout; you cannot build a fully comprehensive view with Domo. If you have seen Tableau or QlikView, they provide very good UX/UI in their products, which makes their dashboards appealing. Domo lacks that and was not a product created for storyboarding. It is more for analysis.
The advanced analytic charts are easy to create. If you compare it with other tools in the market, it's very easy to check your data and build charts.
Go for it. The product is quite good. I would rate it as a seven and a half (out of 10).
It makes things more attractive and simpler. When you come to the analytics part, you want things to be simpler because there are other areas that you want to focus on than just creating a dashboard.
BusinessObjects has a lot of tools, including Web Intelligence, Crystal Reports, Analysis for Office, SAP Lumira, and Analytical Cloud. SAP also has a new tool for HANA-based applications it introduced around 2018. Analysis for Office is an SAP add-on inside Microsoft Office. It works inside of Office tools like Excel, so you have the option to get data from Excel, and there's a direct connection with SAP. You can point that to your HANA database or a BEx query also.
You can also connect SAP to PowerPoint, so you can create presentations from the HANA database or a BEx query. We had 180 to 200 reports on Analysis for Office in my last implementation. Most of our company users were good at Excel, so it was easy to use an external data connection to Excel.
For example, say we have different sheets in Excel. We populate the data from the BEx query or the HANA database in the first sheet. In the second, we'll do some options like the lookup function for Match Index and the reports. The data will be constantly refreshed in the backend. Finally, we have to create the report and publish it to the SAP BI Launchpad to be shared with everyone.
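The workbook pattern described above can be sketched abstractly. This is not SAP code; it just mimics what Excel's MATCH and INDEX do against the refreshed data sheet, and the rows and field names are made up for illustration:

```python
# Sketch of the two-sheet workbook pattern: sheet 1 holds rows
# refreshed from the BEx query / HANA source; sheet 2 builds the
# report by key lookups, the way MATCH + INDEX do in Excel.

# Hypothetical refreshed data, as it might land in the first sheet.
sheet1 = [
    {"region": "EMEA", "revenue": 120},
    {"region": "APAC", "revenue": 95},
    {"region": "AMER", "revenue": 140},
]

def match_index(rows, key_field, key, value_field):
    """Emulate Excel's MATCH (find the row) + INDEX (read a column)."""
    for row in rows:
        if row[key_field] == key:
            return row[value_field]
    return None  # Excel would show #N/A

# Sheet 2: the report pulls figures out of sheet 1 by key.
report = {r: match_index(sheet1, "region", r, "revenue") for r in ["EMEA", "APAC"]}
print(report)  # → {'EMEA': 120, 'APAC': 95}
```

Because the first sheet refreshes from the backend connection, the lookup formulas in the second sheet pick up new figures automatically before the report is published to the BI Launchpad.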
The other thing is the WEBI, or Web Intelligence report. That's the most powerful reporting feature inside BusinessObjects. We normally use WEBI for ad hoc reporting, not for dashboarding, because the dashboard visualization is not that great. WEBI will work even if you have more than 10 million rows.
WEBI will work with any amount of data. I have more than 100 gigabytes of data in WEBI. It's best for ad hoc reporting instead of dashboards. SAP has its own dashboard tool inside BusinessObjects dedicated to dashboards and visualizations. You cannot do any ad-hoc reporting inside that.
In terms of dashboards, they introduced another tool called Design Studio. Design Studio is another tool inside SAP BusinessObjects, and it is better for dashboarding and summary reporting. For example, you can take a data table and create a graphical representation. That's SAP Design Studio, while WEBI is the tool we use globally.
All of our SAP BusinessObjects users will always prefer to work in WEBI, Web Intelligence. WEBI has two versions: one is inside the launch pad and is browser-based; the second, called Web Intelligence Rich Client, can be installed on your desktop. Lumira is comparable to Tableau or Power BI. Lumira was introduced in 2013 or 2014; I forget the year, but it was introduced after Tableau. Lumira has a great story function. There is a story option in Tableau, but that started in Lumira.
SAP had another tool called Explorer, a simple tool to preview data that could be used for both ad hoc reporting and visualization, but they discontinued it in December 2020. Adobe Flash Player was discontinued, and Explorer was completely dependent on Flash. The last tool, SAP Analytics Cloud, is currently strong in the market; it was introduced in 2020, I think. People prefer SAC now. SAC can be used for both ad hoc and dashboard reporting.
There are two tools for BusinessObjects' schematic layer called the Universe Design Tool and the Information Design Tool. These are the most powerful tools, and they set BusinessObjects' reporting apart from other solutions.
If my organization has 300 or 400 tables, I can combine all of them into one universe, and everyone can use that. It is just a schematic layout that does not hold any data but the table relationships.
UDT is perfect, and you can do anything in it. There are never any issues when joining tables because there are a lot of options. With tables, two things always come to mind: loops and traps. These are the main difficulties we face when joining tables, but loops and traps are easily resolved inside BusinessObjects UDT and IDT, which have alias and context features that resolve these issues.
IDT and UDT form the backbone of BusinessObjects. There is one more thing called publication. I haven't seen this feature in any other tools. Publication is useful for bulk reporting. For example, say I want to send reports to 200 Indian salespeople, and I want to apply a filter so the reports only go to specific cities. This can be done in BusinessObjects in five minutes. This cannot be done in any other tool like Tableau or Power BI.
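The publication (bursting) idea described above, one dataset split per recipient by a filter such as city, can be sketched in outline. This is plain Python, not a BusinessObjects API, and all names and data are hypothetical:

```python
# Sketch of report bursting: split one report dataset into per-city
# bundles so each salesperson only receives rows for their own city.
from collections import defaultdict

# Hypothetical source rows for the combined sales report.
sales_rows = [
    {"city": "Delhi", "rep": "a@example.com", "amount": 10},
    {"city": "Mumbai", "rep": "b@example.com", "amount": 20},
    {"city": "Delhi", "rep": "a@example.com", "amount": 5},
]

def burst_by_city(rows):
    """Group report rows into per-city bundles for distribution."""
    bundles = defaultdict(list)
    for row in rows:
        bundles[row["city"]].append(row)
    return dict(bundles)

bundles = burst_by_city(sales_rows)
for city, rows in bundles.items():
    # In a real publication, each bundle would be rendered and emailed.
    print(city, len(rows))
```

The appeal of the built-in publication feature is that this filter-render-distribute loop, here written by hand, is configured once and applied to hundreds of recipients.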
BusinessObjects' reporting tools have not been perfected yet. However, there are two ETL tools inside BusinessObjects; they sit in the schematic layer between the database and the reporting.
But if we're talking negative aspects of BusinessObjects, it's like comparing a bus and a bike. If you want to reach somewhere nearby within five minutes, you can use a bike instead of the bus because there will be a lot of traffic and lots of people inside the bus. If you have large amounts of data, then go for BusinessObjects. If you have a light amount of data, it's better to use Tableau or Power BI tools.
I've worked with SAP BusinessObjects for 10 to 15 years.
BusinessObjects' stability is awesome with a huge amount of data, but you're often running three or four tools at a time. For example, say I want to do reporting in BusinessObjects. First, I have to think about the type of schematic layer I must use: UDT or IDT. Second, I have to think about what type of reporting tool I'll need: ad hoc, detailed summary, or dashboard reporting.
If it is an ad hoc report, I will go for Crystal Reports. If it is just dashboard reporting, I have to go for SAC or Lumira. These points of confusion will be there for every user. Anyone who really wants to work with BusinessObjects should understand at least three or four of its tools. With Tableau, you only need to know Tableau; you don't have to think about other tools, because everything is inside Tableau or Power BI.
BusinessObjects will give you a lot of options. There will be a proper category, like schematic layout developer, report developer, report viewers, etc. And there are different categories of users inside BusinessObjects. Tableau and Power BI don't have such categories.
Factoring in total implementation and maintenance costs, SAP BusinessObjects is too expensive. If you deal with a huge amount of data, you can go with BusinessObjects. However, if you are a medium-sized company with a modest amount of data, you can opt for another solution.
I rate SAP BusinessObjects eight out of 10.