Applications Archives | eWEEK
https://www.eweek.com/enterprise-apps/

Looker vs. Power BI: Latest Software Comparison
https://www.eweek.com/big-data-and-analytics/looker-vs-power-bi/ (Thu, 14 Dec 2023)

Looker by Google and Microsoft Power BI are both business intelligence (BI) and data analytics platforms that maintain a strong following. These platforms have grown their customer bases by staying current with the data analytics space, and by enabling digital transformation, data mining, and big data management tasks that are essential for modern enterprises. In particular, both of these vendors have begun investing in tools and resources that support data democratization and AI-driven insights.

Because both are well-regarded data analytics platforms in the BI space, users may have a difficult time deciding between Looker and Power BI for their data management requirements. There are arguments for and against each, and in this comparison guide, we’ll dive deeper into core features, pros, cons, and pricing for Looker and Power BI.

But before we go any further, here’s a quick summary of how each product stands out against its competitors:

  • Looker: Best for current Google product users and others who are most interested in highly configurable and advanced analytics capabilities, including data visualizations and reporting. Looker Studio in particular balances ease of use with high levels of customization and creativity, while also offering users a lower-cost version of an otherwise expensive platform.
  • Power BI: Best for current Microsoft product users and others who want an easy-to-use and affordable BI tool that works across a variety of data types and use cases. This is considered one of the most popular BI tools on the market and meets the needs of a variety of teams, budgets, and experience levels, though certain customizations and big data processing capabilities are limited.

Looker vs. Power BI at a Glance

  • Core Features: Dependent on use case (tie)
  • Ease of Use and Implementation: Power BI
  • Advanced Data Analytics: Looker
  • Integrations: Dependent on use case (tie)
  • Pricing: Power BI

What Is Looker?

An example dashboard in Looker. Source: Google.

Looker is an advanced business intelligence and data management platform that can be used to analyze and build data-driven applications, embed data analytics in key organizational tools, and democratize data analysis in a way that preserves self-service capabilities and configurability. The platform has been managed by Google since its acquisition in 2019, and because of its deep integration within the Google ecosystem, it is a favorite among Google Cloud and Workspace users for unified analytics projects. However, the tool also works well with other cloud environments and third-party applications, as it maintains a fairly intuitive and robust collection of integrations.
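For teams evaluating how Looker fits into custom or embedded applications, most of that programmatic work goes through the Looker API. The snippet below is a minimal sketch, not taken from the article; it assumes the official looker_sdk Python package with credentials supplied via a looker.ini file or environment variables, and the Look ID shown is a placeholder. Method names can vary between SDK and API versions, so treat it as illustrative only.

```python
# Minimal sketch: pull the results of a saved Look via the Looker API.
# Assumes the looker_sdk package (pip install looker-sdk) and that
# LOOKERSDK_BASE_URL / LOOKERSDK_CLIENT_ID / LOOKERSDK_CLIENT_SECRET
# are set in the environment or a looker.ini file.
import json

import looker_sdk


def fetch_look_results(look_id):
    """Run a saved Look and return its rows as a list of dicts."""
    sdk = looker_sdk.init40()  # authenticate against the 4.0 API
    raw = sdk.run_look(look_id=look_id, result_format="json")
    return json.loads(raw)


if __name__ == "__main__":
    # "42" is a hypothetical Look ID used only for illustration.
    for row in fetch_look_results("42")[:5]:
        print(row)
```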

Key features of Looker

The Looker Marketplace includes various types of “Blocks,” which are code snippets that can be used to quickly build out more complex analytics models and scenarios. Source: Google.
  • Comprehensive data visualization library: In addition to letting users custom-configure their visualizations to virtually any parameters and scenarios, Looker’s data visualization library includes a wide range of prebuilt visual options. Traditional visuals like bar graphs and pie charts are readily available, as are more complex options such as heatmaps, funnels, and timelines.
  • “Blocks” code snippets: Rather than forcing users to reinvent the wheel, Looker Blocks provides prebuilt code snippets and data models that help users stand up high-quality analytics quickly. Industry-specific, cloud-specific, and data-source-specific blocks are all available, which makes this a great solution for users of all backgrounds who want to get started with complex models more quickly.
  • Governed and integrated data modeling: Because Looker pairs its proprietary modeling language with Git-driven version control for data definitions and rules, users can build trusted, governed data sources that yield higher-quality and more accurate data models, regardless of how many teams are working off of them.

Pros

  • Looker comes with a large library of prebuilt integrations — including for many popular data tools — and also offers user-friendly APIs for any additional integrations your organization may need to set up.
  • Looker’s visualizations and reports are easy to customize to your organization’s more specific project requirements and use cases; it also offers one of the more diverse visualization libraries in this market.
  • LookML allows users to create centralized governance rules and handle version control tasks, ensuring more accurate outcomes and higher quality data, even as data quantities scale.

Cons

  • On-premises Looker applications do not easily connect to Looker Studio and other cloud-based tools in user portfolios, which severely limits the ability to maintain data projects accurately and in real time for on-prem users.
  • Looker uses its own modeling language, which can make it difficult for new users to get up and running quickly.
  • Some users have had trouble with self-service research and the vendor’s documentation.

What Is Power BI?

An example Power BI dashboard. Source: Microsoft.

Microsoft Power BI is a business intelligence and data visualization solution that is one of the most popular data analytics tools on the market today. As part of the Microsoft Power Platform, the tool is frequently partnered with Microsoft products like Power Automate, Power Apps, and Power Pages to get the most out of data in different formats and from different sources. Its focus on ease of use makes it a leading option for teams of all backgrounds; especially with the growth of its AI-powered assistive features, visualization templates, and smooth integrations with other Microsoft products, it has become one of the best solutions for democratized data science and analytics.

Key features of Power BI

Power BI is considered one of the best mobile BI tools for many reasons, including because its visualizations and dashboards are optimized for mobile view. Source: Microsoft.
  • AI-driven analytics: AI-powered data analysis and report creation have already been established in this platform, but recently, the generative AI Copilot tool has also come into preview for Power BI. This expands the platform’s ability to create reports more quickly, summarize and explain data in real time, and generate DAX calculations.
  • Dynamics 365 integration: Power BI integrates relatively well with the Microsoft Dynamics CRM, which makes it a great option for in-depth marketing and sales analytics tasks. Many similar data platforms do not offer such smooth CRM integration capabilities.
  • Comprehensive mobile version: Unlike many other competitors in this space, Microsoft Power BI comes with a full-featured mobile application, designed specifically for smaller screens, that is available at all price points and user experience levels. With native mobile apps available for Windows, iOS, and Android, any smartphone user can quickly review Power BI visualizations and dashboards from their personal devices.

Pros

  • Power BI can be used in the cloud, on-premises, and even as an embedded solution in other applications.
  • The user interface will be very familiar to users who are experienced with Microsoft products; for others, the platform is accompanied by helpful training resources and ample customer support.
  • This platform makes democratized data analytics simpler, particularly with AI-powered features and a growing generative AI feature set.

Cons

  • While some users appreciate that Power BI resembles other Microsoft 365 office suite interfaces, other users have commented on the outdated interface and how it could be improved to look more like other cloud-based competitors.
  • Especially with larger quantities of data, the platform occasionally struggles to process data quickly and accurately; users report slower load times, crashes, and bugs under heavier loads.
  • Visualizations are not very customizable, especially compared to similar competitors.

Best for Core Features: It Depends

Both Looker and Power BI offer all of the core features you would expect from a data platform, including data visualizations, reporting and dashboarding tools, collaboration capabilities, and integrations. They also offer additional features to assist users with their analytical needs: Power BI offers support through AI assistance, while Looker supports users with prebuilt code snippets and a diverse marketplace of integrations and plugins.

Microsoft maintains a strong user base with its full suite of data management features and easy-to-setup integrations with other Microsoft tools. It can be deployed on the cloud, on-premises, and in an embedded format, and users can also access the tool via a comprehensive mobile application.
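Because Power BI also exposes its content programmatically, embedded and automated scenarios are usually scripted against the Power BI REST API rather than clicked through. Below is a minimal sketch, not from the article, that lists the reports in the caller’s workspace; it assumes you have already obtained an Azure AD access token with the appropriate Power BI scope, and the endpoint shown should be verified against current Microsoft documentation.

```python
# Minimal sketch: list Power BI reports in "My workspace" via the REST API.
# Assumes an Azure AD access token with Power BI scope is already available
# (for example, acquired with MSAL); token acquisition is out of scope here.
import requests

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"


def list_reports(access_token):
    """Return the reports visible to the token holder in My workspace."""
    resp = requests.get(
        f"{POWER_BI_API}/reports",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])


if __name__ == "__main__":
    token = "REPLACE_WITH_ACCESS_TOKEN"  # placeholder for illustration only
    for report in list_reports(token):
        print(report.get("name"), report.get("webUrl"))
```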

Looker is web-based and offers plenty of analytics capabilities that businesses can use to explore, discover, visualize, and share analyses and insights. Enterprises can use it for a wide variety of complex data mining techniques. It relies on its own modeling language to define data relationships, so users don’t have to write raw SQL. Looker is also tightly integrated with a great number of Google datasets and tools, including Google Analytics, as well as with several third-party data and business tools.

Looker earns good marks for reporting granularity, scheduling, and extensive integration options that create an open and governable ecosystem. Power BI tends to perform better than Looker in terms of breadth of service due to its ecosystem of Microsoft Power Platform tools; users also tend to prefer Power BI for a comprehensive suite of data tools that aren’t too difficult to learn how to use.

Because each tool represents such a different set of strengths, it’s a tie for this category.

Best for Ease of Use and Implementation: Power BI

In general, users who have tried out both tools find that Power BI is easier to use and set up than Looker.

Power BI provides users with a low-code/no-code interface as well as a drag-and-drop approach to its dashboards and reports. Additionally, its built-in AI assistance — which continues to expand with the rise of Copilot in Power BI — helps users initiate complex data analytics tasks regardless of their experience with this type of technology or analysis.

For some users, Looker has a steep learning curve because they must learn and use the LookML proprietary programming language to set up and manage their models in the system. This can be difficult for users with little experience with modeling languages, but many users note that the language is easy to use once they’ve learned its basics. They add that it streamlines the distribution of insights to staff across many business units, which makes it a particularly advantageous approach to data modeling if you’re willing to overcome the initial learning curve.

The conclusion: Power BI wins on general use cases for a non-technical audience whereas Looker wins with technical users who know its language.

Best for Advanced Data Analytics: Looker

While both tools offer unique differentiators for data analytics operations, Looker outperforms Power BI with more advanced, enterprise-level data governance, modeling, and analytics solutions that are well integrated with common data sources and tools.

Both tools offer extensive visualization options, but Looker’s data visualizations and reporting are more customizable and easier to configure to your organization’s specs and stakeholders’ expectations. Looker also streamlines integrations with third-party data tools like Slack, Segment, Redshift, Tableau, ThoughtSpot, and Snowflake, while also working well with Google data sources like Google Analytics. As far as its more advanced data analytics capabilities go, Looker surpasses Power BI and many other competitors with features like granular version control capabilities for reports, comprehensive sentiment analysis and text mining, and open and governed data modeling strategies.

However, Looker has limited support for certain types of analytics tasks, like cluster analysis, whereas Power BI is considered a top tool in this area. And, so far, Power BI does AI-supported analytics better, though Google does not appear to be too far behind on this front.

It’s a pretty close call, but because of its range of data analytics operations and the number of ways in which Google makes data analytics tasks customizable for its users, Looker wins in this category.

Also see: Best Data Analytics Tools 

Best for Integrations: It Depends

When it comes to integrations, either Power BI or Looker could claim the upper hand here.

It all depends on whether you’re operating in a Microsoft shop or a Google shop. Current Microsoft users will likely prefer Power BI because of how well it integrates with Azure, Dynamics 365, Microsoft 365, and other Microsoft products. Similarly, users of Google Cloud Platform, Google Workspace, and other Google products are more likely to enjoy the integrated experience that Looker provides with these tools.

If your organization is not currently working with apps from either of these vendor ecosystems, it may be difficult to set up certain third-party integrations with Power BI or Looker. For example, connecting Power BI to a collaboration and communication tool like Slack generally requires users to use Microsoft Power Automate or an additional third-party integration tool. Looker’s native third-party integrations are also somewhat limited, though the platform does offer easy-to-setup integrations and actions for tools like Slack and Segment.

Because the quality of each tool’s integrations depends heavily on the other tools you’re already using, Power BI and Looker tie in this category.

Best for Pricing: Power BI

Power BI is consistently one of the most affordable BI solutions on the market. And while Looker Studio in particular helps to lower Looker’s costs, the platform is generally considered more expensive.

Power BI can be accessed through two main free versions: Power BI Desktop and a free account in Microsoft Fabric. The mobile app is also free and easy to access. But even for teams that require more functionality for their users, paid plans are not all that expensive. Power BI Pro costs only $10 per user per month, while Power BI Premium is $20 per user per month.

Looker, on the other hand, is more expensive, requiring users to pay a higher price for its enterprise-class features. The Standard edition’s pay-as-you-go plan costs $5,000 per month, while all other plans require an annual commitment and a conversation with sales to determine how much higher the costs will be.

Additionally, there are user licensing fees that start at $30 per month for a Viewer User; users are only able to make considerable changes in the platform as either a Standard User or a Developer User, which cost $60 and $125 per user per month, respectively.

Power BI takes the lead when it comes to pricing and general affordability across its pricing packages.

Also see: Top Digital Transformation Companies

Why Shouldn’t You Use Looker or Power BI?

While Looker and Power BI are both favorites among data teams and citizen data scientists alike, each platform has unique strengths — and weaknesses — that may matter to your team. If any of the following qualities align with your organizational makeup, you may want to consider investing in a different data platform.

Who Shouldn’t Use Looker

The following types of users and companies should consider alternatives to Looker:

  • Users who want an on-premises BI tool; most Looker features, including useful connections to Looker Studio, are only available to cloud users.
  • Users who are not already working with other Google tools and applications may struggle to integrate Looker with their most-used applications.
  • Users with limited computer-language-learning experience may struggle, as most operations are handled in Looker Modeling Language (LookML).
  • Users who want a lower-cost BI tool that still offers extensive capabilities to multiple users.
  • Users in small business settings may not receive all of the vendor support and affordable features they need to run this tool successfully; it is primarily designed for midsize and larger enterprises.

Who Shouldn’t Use Power BI

The following types of users and companies should consider alternatives to Power BI:

  • Users who need more unique and configurable visualizations to represent their organization’s unique data scenarios.
  • Users who are not already working with other Microsoft tools and applications may struggle to integrate Power BI into their existing tool stack.
  • Users who consistently process and work with massive quantities of data; some user reviews indicate that the system gets buggy and slow with higher data amounts.
  • Users who work with a large number of third-party data and business apps; Power BI works best with other Microsoft tools, especially those in the Power Platform.
  • Users who consistently need to run more complex analytics, such as predictive analytics, may need to supplement Power BI with other tools to get the results they need.

If Looker or Power BI Isn’t Ideal for You, Check Out These Alternatives

Both Looker and Power BI offer extensive data platform features and capabilities, as well as smooth integrations with many users’ most important data sources and business applications. However, these tools may not be ideally suited to your team’s particular budget, skill sets, or requirements. If that’s the case, consider investing in one of these alternative data platform solutions:


Domo

Domo puts data to work for everyone so they can extend their data’s impact on the business. Underpinned by a secure data foundation, the platform’s cloud-native data experience makes data visible and actionable with user-friendly dashboards and apps. Domo is highly praised for its ability to help companies optimize critical business processes at scale and quickly.


Yellowfin

Yellowfin is a leading embedded analytics platform that offers intuitive self-service BI options. It is particularly successful at accelerating data discovery. Additionally, the platform allows anyone, from an experienced data analyst to a non-technical business user, to create reports in a governed way.


Wyn Enterprise

Wyn Enterprise offers a scalable embedded business intelligence platform without hidden costs. It provides BI reporting, interactive dashboards, alerts and notifications, localization, multitenancy, and white-labeling in a variety of internal and commercial apps. Built for self-service BI, Wyn offers extensive visual data exploration capabilities, creating a data-driven mindset for the everyday user. Wyn’s scalable, server-based licensing model allows room for your business to grow without user fees or limits on data size.


Zoho Analytics

Zoho Analytics is a top BI and data analytics platform that works particularly well for users who want self-service capabilities for data visualizations, reporting, and dashboarding. The platform is designed to work with a wide range of data formats and sources, and most significantly, it is well integrated with a Zoho software suite that includes tools for sales and marketing, HR, security and IT management, project management, and finance.


Sigma

Sigma is a cloud-native analytics platform that delivers real-time insights, interactive dashboards, and reports, so you can make data-driven decisions on the fly. With Sigma’s intuitive interface, you don’t need to be a data expert to dive into your data, as no coding or SQL is required to use this tool. Sigma has also recently brought forth Sigma AI features for early access preview.

Review Methodology

Looker and Power BI were reviewed based on a few core standards and categories in which data platforms are expected to perform. The four categories covered below have been weighted according to how important they are to user retention over time.

User experience – 30%

When it comes to user experience, we paid attention to how easy each tool is to use and implement and how many built-in support resources are available for users who have trouble getting started. Additionally, we considered how well the platform performs under certain pressures, like larger data loads, security and user control requirements, and more complex modeling and visualization scenarios. Finally, we considered the availability of the tool in different formats and how well the tool integrates with core business and data applications.

Scalability and advanced analytics compatibility – 30%

Our review also considered how well each platform scales to meet the needs of more sophisticated analytics operations and larger data processing projects. We paid close attention to how the platform performs as data loads grow in size and complexity, looking at whether user reviews mention any issues with lag times, bugs, or system crashes. We also considered what tools were available to assist with more complex analytics tasks, including AI-powered insights and support, advanced integrations and plugins, and customizable dashboards and reports.

Integrability – 20%

We considered how well each tool integrated with other software and cloud solutions from the same vendor as well as how easy it is to set up third-party integrations either via prebuilt connectors or capable APIs. In particular, we examined how well each platform integrated with common data sources outside of its vendor ecosystem, including platforms like Redshift, Snowflake, Salesforce, and Dropbox.

Cost and accessibility – 20%

For cost and accessibility, we not only focused on low-cost solutions but also on how well each solution’s entry-level tiers perform and meet user needs. We assessed the user features available at each pricing tier, how quickly pricing rises (especially for individual user licenses or any required add-ons), and whether or not a comprehensive free version was available to help users get started.

Bottom Line: Looker vs. Power BI

Microsoft’s Power BI has consistently been among the top two or three business intelligence tools on the market, recruiting and retaining new users with its balance of easy-to-use features, low costs, useful dashboards and visualizations, range of data preparation and management tools, AI assistance, and Microsoft-specific integrations. It is both a great starter and advanced data platform solution, as it offers the features necessary for citizen data scientists and more experienced data analysts to get the most out of their datasets.

Power BI tends to be the preferred tool of the two because of its general accessibility and approachability as a tool, but there are certain enterprise user needs for reporting and analytics distribution where Looker far outperforms Power BI. And for those heavily leaning on Google platforms or third-party applications, Looker offers distinct advantages to skilled analysts.

Ultimately, Looker doesn’t really try to compete head-to-head with Microsoft, because they each target different data niches and scenarios. It’s often the case that prospective buyers will quickly be able to identify which of these tools is the best fit for their needs, but if you’re still not sure, consider reaching out to both vendors to schedule a hands-on demo.

Read next: Best Data Mining Tools and Software

Newgen’s Low-Code Platform Geared for Digital Transformation
https://www.eweek.com/enterprise-apps/newgens-low-code-platform-geared-for-digital-transformation/ (Tue, 10 Oct 2023)

Old-school, developer-led initiatives are tried and true but won’t lead to digital success.

Digital transformation isn’t something that happens in isolation. It affects the entire enterprise and, if not done correctly, can disrupt every aspect of the company.

The critical elements of digital transformation to keep in mind are customer experience, operational excellence, and innovation. Specifically:

  • Delight your customers: You need to optimize customer experiences, ensuring a smooth, automated journey that is connected and personalized across all channels.
  • Stay flexible, productive, and efficient: Hybrid workforces demand a lot. Building an ethic of operational excellence that can handle all the needs of your company while staying secure and accountable is critical.
  • Look to the future: Innovation is about optimizing day-to-day operations so you can focus on new products and opportunities.

A “Do It Yourself” Approach is Hard to Execute

Executing digital transformation by yourself is a daunting prospect. And, even with some of the common platforms available, the heavy lifting can be overwhelming.

Getting the coders you need to get everything done is a thankless task—something that might be impossible in today’s labor market. But rather than putting this on next year’s to-do list, there are ways to do everything you need now using a different approach: No code.

We’ve all heard the soothing terms low code and no code. But how viable are they? ZK Research has evaluated several solutions in the past few years, and we see them as a shortcut around development cycles that used to take months or years.

One such platform comes from Newgen, an India-based company whose technology can automate all aspects of a business for efficient operations and end-to-end customer journeys, and it can deliver the business outcomes we described above. The company’s NewgenONE platform enables the following:

  • Automate manual processes and applications.
  • Speed up transformations with low/no code.
  • Create insights from documents, images, videos, and audio files.
  • Extract more from existing systems without a complete rebuild.
  • Build a culture of innovation across a company’s ecosystem, including internal groups, customers, and partners.
  • Scale and secure the cloud.

Also see: Top Digital Transformation Companies

The Platform

The NewgenONE platform automates key parts of a business to make operations more efficient and ensure customer journeys are seamless.

One Newgen customer, Bank Midwest, a leading community bank in the Midwest region with assets totaling $1.3 billion, chose NewgenONE to transform its processes and customer experience with a tailored solution that eliminated the need for multiple-point solutions. The bank efficiently integrated these features into their projects, paving the way for enhanced efficiency and scalability to meet future business demands.

Another financial organization—Georgia’s Own Credit Union, one of the largest credit unions in Georgia—partnered with Newgen to transition 65% of its new applications online, reducing the application journey time to just five minutes. In addition, NewgenONE reduced back-office processing time by 35%.

Bottom Line: Low Code/No Code Delivers

Low code/no code platforms can often be all talk and no reality. But after speaking to some of their customers, ZK Research believes the NewgenONE platform can deliver on the promise.

Newgen has also been recognized by other research firms, most notably Forrester and Gartner, as having a complete and easy-to-use platform. The company says that, on average, customers have achieved an ROI of 371% over three years, which is in line with the case studies I cited above.

Executing on digital transformation using old-school tools will not scale and can often lead to unsuccessful projects. Low code / no code masks much of the complexity that developers will face and simplifies digital projects.

Read next: Digital Transformation Guide

Challenges of Microsoft Teams: User Experience and Unbundling
https://www.eweek.com/enterprise-apps/challenges-of-microsoft-teams-user-experience-and-unbundling/ (Tue, 12 Sep 2023)

Every week, I have many calls and meetings—Zoom, Cisco Webex, Google Meet, Skype (believe it or not), and the dreaded Microsoft Teams. Whenever I get an invite for a Teams call, I wonder what will happen when I click the link.

Teams can be unpredictable. Say what you will about the other players, but they just work.

Zoom started as a standalone product—and if it didn’t work, the company would’ve gone under. Webex was one of the earliest players, so they know how to make meetings work. Google Meet always feels strange, but it’s generally competent, although essentially a “me too” product. Skype is, well, Skype. It’s nothing fancy—kind of AOL Messenger on steroids—but it works.

Poor User Experience is the Teams Challenge

The trouble with Teams is that, in an effort to integrate all its products, Microsoft appears to have forgotten about the experience of actually using the individual product. Most people don’t care about integration—or, more correctly, users don’t care about integration if the product itself doesn’t work.

The Teams experience, at least for me and many users I have talked to, has not been good. Most times I go to use it, I get stuck in some kind of login issue. On more than one occasion, almost as a throwback to the late 90s or early 2000s, I’ve resorted to just using the dial-in numbers.

If I do get past the login, I then have to change settings for backgrounds and so on as I’m getting set to enter the call. Meanwhile, the clock is ticking, and I’m late for the meeting.

One of the most significant issues with Teams is guest access. If I host a meeting or call, I’ll use Zoom, so I often join Teams as a guest.

The experience as a guest is inconsistent. Some features, like chat, work on some calls and not others. It also creates multiple user IDs, which can create confusion for end users. This is why many organizations use Teams for internal meetings but a product like Zoom for external. Guest access is better in the latest release but it’s still not on par with other vendors.

Also see: Digital Transformation Guide

The EU and Unbundling

Once inside Teams, things work all right. Yet the experience, compared to Zoom or Webex, is severely lacking. But there may be hope on the horizon—in the unlikely guise of EU regulators. You may have read that the EU has prompted Microsoft to unbundle Teams.

One news article I read about the unbundling said Teams was considered the crown jewel of the Office 365 suite. I’m not sure about that, but it’s apparent that Microsoft is unhappy about unbundling. From a business standpoint, no one would be happy about this. Government regulators interfering in the way a company operates is generally abhorrent.

Looking back, Microsoft wasn’t happy about making it easier to uninstall Internet Explorer after wrangling with the Justice Department in the late ’90s and early 2000s. But the company survived allowing non-Microsoft software into its walled garden and grew into a trillion-plus-dollar company—despite years of feckless management under Steve Ballmer.

For users of Teams in the EU, unbundling could be a good thing if Microsoft approaches it correctly. Thinking about the user experience for Teams itself rather than the overall integration should help the company address the usability issues.

The ability to link to a doc from SharePoint is probably not a high priority for people just trying to have a seamless meeting. But those integrations should be relatively easy to maintain. One big step the company could take in improving usability is to acquire a company like Box. One might think that Box and Microsoft OneDrive are overlapping products, but the reality is that Box delivers a better Microsoft user experience than Microsoft’s own product does.

I know many people think I’m anti-Teams but, in reality, I’m anti-bad user experience. I want to like Teams. I really do. I hope that Microsoft will use the unbundling as an opportunity to free Teams from the suite everywhere and focus its usability experts on creating a great product that is easy to use and just works.

Microsoft Needs to Prioritize Usability 

I might be shouting into the void, but Microsoft product engineers might want to rethink how they approach Teams. It’s a simple thought: Prioritize the core functionality that users need or want. Then, approach that development from the inside out, like Zoom did. Think about enabling great meetings and messaging. Worry about everything else later.

The challenge with Microsoft is that the company appears to prioritize locking customers into license bundles over everything else, including usability. By making the job of procuring the product so frictionless, it can use that audience to push substandard products on corporate workers. This is why so many workers will say something like, “I really don’t like using Teams, but the IT department made the decision.”

The danger that unbundling creates is that it removes that lock and creates an opening for better products. Proof of that is how fast Google managed to have Chrome usurp Internet Explorer’s once dominant position.

If you’re a customer of Teams in one of the countries where it will be unbundled, do your due diligence and look at Zoom, RingCentral, Webex, 8×8, Avaya, or any of the other many products and use Box as the solution to integrate data across them. You’ll have happier customers and likely save yourself some money.

Read next: Top Digital Transformation Companies

RingCentral Acquires Hopin Assets For Hybrid Events
https://www.eweek.com/enterprise-apps/ringcentral-acquires-hopin-assets-for-hybrid-events/ (Mon, 07 Aug 2023)

RingCentral has broadened its collaboration platform through the acquisition of select assets from Hopin, an online audience engagement technology provider.

This acquisition includes Hopin’s flagship Events platform and Session product, signaling a strategic move by RingCentral to bring customers more dynamic, interactive video solutions through virtual events.

For more information, also see: Digital Transformation Guide

Hopin Provides a Single Solution for Hybrid Events

Hopin Events is a one-stop shop for managing all aspects of an event. Using the platform, companies can easily set up and run events that are completely online, or a mix of online and in-person events like conferences, training sessions, or even virtual expos with sponsor booths.

The platform provides unique features, such as the ability to host multiple sessions at once, tools for networking, and options to sign up and register for events. While many collaboration vendors offer a virtual event platform, few have a hybrid solution where the in-person and digital events can be managed simultaneously.

Hopin Session makes meetings more interactive and personalized. With Session, companies can customize the user experience through things like breakout rooms for smaller group discussions, polls to gauge attendees’ opinions, and much more. The idea is to make meetings feel less like a one-sided presentation and more like a group conversation.

RingCentral Rounds out Its Portfolio with Hybrid Events

With Hopin’s technology, RingCentral can offer a more rounded service portfolio. However, the acquisition wasn’t only about technology, as it brings nearly 100 Hopin employees, many of them technical experts, into the RingCentral fold.

This is a significant injection of new talent into RingCentral’s operations, which already covers video meetings, webinars, and digital conference rooms, said Vlad Shmunis, RingCentral’s founder and CEO, during a news briefing with analysts.

“The acquisition of Hopin brings fresh energy and innovation to RingCentral’s video services. We now have a complete solution—meetings, rooms, webinars, and events—enhanced by the unmatched customization capabilities of Hopin’s Session portfolio. This will offer a superior experience for hosting a meeting or an event.”

Also see: Top Digital Transformation Companies

Bringing Simplicity to Hybrid Events

RingCentral has established a reputation for simplifying business communications with its video tool suite. The suite includes RingCentral Video, an online platform enabling users to arrange or join meetings from anywhere; RingCentral Rooms, which turns any location into a video conference room; and RingCentral Webinar, capable of hosting large online gatherings of up to 10,000 participants from any computer, web browser, or mobile device.

As RingCentral integrates Hopin’s technology into its core offerings, the user experience is set to undergo significant enhancements, Shmunis added. RingCentral is reinventing itself as an artificial intelligence (AI)-driven company, delivering smartphone solutions, an intelligent contact center, and business solutions for companies of all sizes.

Historically, events have not been considered part of a collaboration suite, but that sentiment is changing. Digital collaboration is about sharing content and ideas, and events do that in a massively scalable way. Businesses can now use RingCentral for one-to-one meetings, one-to-few brainstorming sessions, one-to-many webinars and company meetings, and one-to-thousands gatherings in an event format.

Also see: 100+ Top AI Companies 2023

Tanium Adds Digital Experience Management to its XEM Platform
https://www.eweek.com/enterprise-apps/tanium-digital-experience-management-to-its-xem-platform/ (Fri, 28 Apr 2023)

Cybersecurity and endpoint management company Tanium is broadening its portfolio with the launch of its digital employee experience (DEX) solution. The solution is part of Tanium’s converged endpoint management (XEM) platform, which allows IT teams to proactively measure and manage employees’ digital experiences with self-help capabilities, automated remediation, sentiment surveys, and notification features.

In contrast to traditional approaches, such as helpdesk tickets, Tanium XEM with DEX is designed to help IT maintain and improve the employee experience to keep tickets from being opened in the first place. It provides employees with automated remediation and allows them to fix issues affecting their devices. DEX solutions focus on the endpoint and encompass hardware, operating systems, apps, and cloud services.

Also see: The Successful CISO: How to Build Stakeholder Trust

Legacy IT Management: Time-Consuming and Reactive

One of the challenges with the legacy approach to end-user management is that it’s reactive in nature. A worker who has a problem with an application stops work, calls the help desk, and then the troubleshooting starts. My research shows that with the traditional approach, three-quarters of help desk tickets are opened by the end user versus the IT organization. This means IT pros are always in fire-fighting mode as they are reacting to issues. Tanium DEX gives IT the visibility to get ahead of issues, even ones that users don’t know about.

Digital experience tools have existed for some time but saw an uptick in adoption during the pandemic. When employees shifted to working from home, IT departments scrambled to fix broken experiences, implement virtual private networks (VPNs), and move apps to the cloud. The pandemic highlighted the need for DEX solutions to adapt to changing work environments.

Poor Experience Impacts Workers and Customers

“Productive employees deliver great work, which is why CIOs must prioritize the employee experience in this new hybrid world. In fact, the digital employee experience is increasingly the only work experience for many of us,” said Chris Hallum, director of product marketing at Tanium.

“A poor DEX not only impacts your employees but has a potentially negative impact on your customers. Employees struggling to effectively use their tools aren’t going to deliver their best work and it will undoubtedly impact how they engage with customers.”

Tanium Taps Desktop Information 

While there are a number of digital experience solutions available, Tanium’s approach is different. Many of the solutions use network traffic and infer problems.

For example, the network can see the latency of a Zoom call and if it falls below a threshold, it knows there is a problem. This is fine if the problem is actually the network but what if it’s something on the computer? In this case, the network solution could identify the problem based on Zoom performance but would not know the source. Tanium’s DEX can see things like browser issues, memory problems, or other factors that are tied to the endpoint as well as network performance.

Prior to DEX solutions, self-service options like portals were used, but those saw limited success. Unified endpoint management (UEM) solutions are effective for provisioning and deployment but lack experience monitoring and employee sentiment surveying capabilities.

Organizations that don’t have visibility into employee experiences need to establish feedback loops for better IT management. In my discussion with Tanium, Hallum told me that automated self-remediation is a highly desired feature for detecting and fixing issues.

Tanium DEX consists of two modules with an end-user component:

  • A Performance module: observes performance conditions across endpoint devices and servers, identifying trends and systemic problems.
  • An Engage module: can identify real-time trouble spots, help users fix issues without IT intervention, and can be used to survey employees on the quality of their experiences.

Also see: Secure Access Service Edge: Big Benefits, Big Challenges

DEX Captures Valuable User Sentiment

The solution measures sentiment through surveys after an issue has been resolved, fostering a bi-directional relationship between IT and employees. It can also be used to send notifications related to training and important updates. Lastly, it provides a health score for each endpoint, so IT can address low-performing endpoints and set improvement targets.
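Tanium does not publish the exact formula behind these scores, but the general idea of rolling several endpoint signals into a single comparable number can be sketched in a few lines. The example below is purely illustrative: the metric names, weights, and thresholds are hypothetical and do not reflect Tanium’s actual scoring model.

```python
# Illustrative only: a weighted endpoint "health score" in the spirit of
# DEX tooling. Metric names, weights, and thresholds are hypothetical and
# do not reflect Tanium's actual scoring model.
WEIGHTS = {
    "crash_free": 0.35,        # share of sessions without app crashes (0-1)
    "boot_time": 0.20,         # normalized boot-time score (0-1, higher is faster)
    "cpu_headroom": 0.20,      # 1 minus average CPU utilization (0-1)
    "disk_headroom": 0.15,     # free disk space ratio (0-1)
    "patch_compliance": 0.10,  # share of required patches installed (0-1)
}


def health_score(metrics):
    """Combine normalized endpoint metrics (each 0-1) into a 0-100 score."""
    total = sum(
        weight * max(0.0, min(1.0, metrics.get(name, 0.0)))
        for name, weight in WEIGHTS.items()
    )
    return round(100 * total, 1)


if __name__ == "__main__":
    laptop = {"crash_free": 0.92, "boot_time": 0.70, "cpu_headroom": 0.55,
              "disk_headroom": 0.40, "patch_compliance": 1.0}
    # IT might flag endpoints scoring below some improvement target, say 70.
    print(health_score(laptop))
```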

“With the automation that Tanium brings to the table, organizations can reduce their helpdesk load and redeploy the resources to do more strategic work,” said Hallum. “They can provide better service levels for the really hard issues, as opposed to just constantly responding to the 10% of the issues that represent 80% of the helpdesk tickets.”

Another advantage of Tanium DEX is that, because it is part of the Tanium XEM platform, customers get endpoint management, vulnerability management, and DEX capabilities in a single integrated solution. There are many synergies here.

For instance, if the remediation of a digital experience issue requires the deployment of a patch, Tanium DEX can use the XEM platform to make that happen in real time and at scale across the entire enterprise. Tanium’s access to any type of data, setting, or file on the endpoint, along with its nearly unlimited remediation capabilities, is among its most powerful and differentiated strengths.

Tanium has integration with ServiceNow, which brings Tanium data into the ServiceNow console. But newer features like health scores are not yet included in ServiceNow with this version.

Focused on Endpoint and App Experience

At the moment, with this first release of Tanium DEX, Tanium is clearly focused on endpoint and app experiences. Hallum assured me, however, that other aspects of the employee experience, like cloud service performance, are something Tanium is pursuing.

Businesses are competing on customer experience (CX) today, but it’s impossible to deliver best-in-class CX without a quality employee experience. When workers have poorly performing apps, the resulting frustration can often be passed along to the customer. Businesses need to shed the legacy reactive approach to end-user support and embrace technologies that enable IT pros to move to a more proactive model.

Also see: Top Digital Transformation Companies

Real-Time Data Management Trends
https://www.eweek.com/big-data-and-analytics/real-time-data-management-trends/ (Thu, 23 Mar 2023)

Real-time data management is the application of intelligence to data as soon as it’s created or acquired, rather than being stored for later analysis. Data is processed and forwarded to users as soon as it’s collected – immediately without any lag. This ultra-rapid data management is considered crucial for supporting real time, in-the-moment decision making.

Real-time data is especially valuable for businesses, for a multitude of reasons. It can provide immediate insight into sales trends, and it can also provide immediate insight into security vulnerabilities or degradation of the corporate IT infrastructure.

With digital transformation initiatives well underway, companies are investing in strategies to ingest large volumes of data that enable them to make the right decisions in the moments that matter. Handling the sheer volume and complexity of this data store is exceptionally challenging.

As enterprises meet these data-intensive digital demands, here are five real-time data management trends we anticipate over the next year.

Also see: 7 Digital Transformation Trends Shaping 2022

Data Visualization to Identify Patterns and Trends

Whether it’s real time or sitting at rest in a database, data is nothing more than numbers without visualization. To bring real-time data to life, you need real-time data visualization. Visualization, such as charts, graphs, maps, or other colorful displays, can give you an edge over the competition by mapping out not only the data, but where it is going in terms of activity.

Where data visualization works best is helping to alert companies if something is abnormal or out of the ordinary. For example, a spike in outbound network traffic would show up on the meter, giving a visual cue to anyone watching that there is a sudden flow of traffic which should be investigated.
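The traffic-spike example above boils down to comparing the newest reading against a recent baseline as the data streams in. Here is a minimal sketch of that pattern; the window size and threshold are arbitrary illustration values, and a production pipeline would typically do this inside a stream processor rather than a simple loop.

```python
# Minimal sketch of real-time spike detection on a metric stream.
# Window size and threshold are arbitrary illustration values.
from collections import deque
from statistics import mean, stdev


def spike_alerts(readings, window=60, threshold=3.0):
    """Yield (index, value) whenever a reading exceeds the rolling
    baseline by more than `threshold` standard deviations."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) >= 10:  # wait for a usable baseline
            baseline, spread = mean(history), stdev(history)
            if spread > 0 and (value - baseline) / spread > threshold:
                yield i, value
        history.append(value)


if __name__ == "__main__":
    # Simulated outbound-traffic samples (MB/s) with one obvious spike.
    samples = [10, 11, 9, 10, 12, 11, 10, 9, 11, 10, 10, 11, 95, 10, 11]
    for idx, val in spike_alerts(samples, window=10, threshold=3.0):
        print(f"Possible anomaly at sample {idx}: {val} MB/s")
```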

Visualization has positive uses as well. If a company notices an uptick in sales during certain parts of the day, that will also show up on a chart and command immediate attention. Real-time visualization enables people to act on emerging situations, whether to capitalize on a potential opportunity or to head off a possible negative outcome.

This enables decision-makers to get ahead of the curve and respond quickly and early on, rather than after the fact when an opportunity is missed or damage is done. It also encourages more data interaction, because no one wants to sit and pore through numbers. In contrast, looking at a graph or a chart that summarizes 24 hours of activity in one picture is much more accessible.

With real-time data analysis, companies can identify trends and monitor how well they are achieving goals. They can access data remotely, monitor purchases, manage resources, and help secure their network.

Also see: Top Data Visualization Tools

Data Security is More Important Than Ever

The federal government has issued a Federal Zero Trust Strategy that requires, among other things, adoption of Zero-Trust security measures across the federal government and private sector.

The zero trust strategy will enable agencies to more rapidly detect, isolate, and respond to cybersecurity threats and intrusions. The Office of Management and Budget (OMB) has issued a series of specific security goals for agencies, aligned with existing zero trust models.

For the unfamiliar, zero trust is a network design that takes its name literally. One of the knocks on cybersecurity is that once the bad guys have breached your firewalls, they can move around within your network with impunity. A zero trust network requires validation and credentials to move anywhere within the network. It is a much stricter networking protocol designed to bottle up anyone who breaches the outer wall.

Because of this design, the federal strategy puts a great deal of emphasis on enterprise identity and access controls, including multi-factor authentication, along with encryption and comprehensive auditing.

Also see: Best Data Analytics Tools

Invest in Digital Tools That Improve the Customer Experience 

Research from Boston Digital found 83% of customers are willing to switch to a brand with a better digital experience, and 70% of customers are more likely to trust brands that provide a great digital experience.

That means customers are increasingly loyal to brands that offer a direct relationship with them and know their wants and needs. It also means they won’t hesitate to leave you if they’re not getting the experience they want, and it’s extremely easy to switch consumer loyalties these days.

Therefore, it is important to invest in the right tools to continuously and actively optimize your digital presence to meet customer needs, and to stay on top of changing trends as they constantly shift.

Also see: Guide to Data Pipelines

Businesses will Reinvent Customer Profiles with Real-Time Data

For years, companies have easily fetched customer identity and other information from cookies. With third-party cookies being phased out, and with more than 70 percent of the world’s population protected by privacy regulations, businesses will have to adapt to new targeting strategies to quickly recommend a product or decide if a transaction is fraudulent.

As identity becomes less of a fixed or known data point, enterprises need to immediately analyze a massive swath of data, look for patterns, and extrapolate a likely persona for targeting. They will need to find patterns in real time that target individuals based on attributes or behaviors other than a cookied identity.

As data volumes constantly grow and demand for real-time transactions increases, these trends will traverse all industries where scaling is instrumental to survival. For example, ad tech is experiencing a renaissance, garnering significant investment, innovation, and attention as platforms rapidly seek to serve ads to targeted audiences at petabyte scale.

Likewise, with massive amounts of data streaming from mobile, 5G, and IoT sensor applications, telecom companies need to quickly ingest data and then process it at petabyte scale with virtually no latency.

As we move forward, enterprises need to embrace the opportunities and challenges ahead of them and manage real-time data in new ways to drive successful business outcomes.

Bottom Line: Real Time Data Management in 2023

Real-time data management is the new normal in business. The days of running batch processing on a mainframe overnight and analyzing the results the next day are largely gone. The good news is that there are plenty of tools out there to enable it, ranging from Apache open-source software to commercial software from leaders like IBM and SAP.

Data is also coming from many new sources that weren’t around 10 or 20 years ago, including social media, edge/IoT devices, and mobile users. So real-time data analytics does not just cover your databases and network security; it covers your entire enterprise.

At the same time, data is more regulated than ever. There have been significant data breaches in recent years, and companies have paid massive fines for their sloppiness. So regulatory compliance and protecting your data is as important as extracting value from the data.

About the Author: 

Lenley Hensarling, Chief Strategy Officer, Aerospike

Additionally, tech journalist Andy Patrizio updated this article in 2023. 

IBM’s Vikram Walecha on Low Code and Tech Disruption
https://www.eweek.com/enterprise-apps/ibms-vikram-walecha-low-code-tech-disruption/ (Fri, 24 Feb 2023)

I spoke with Vikram Walecha, CTO, ServiceNow at IBM, about how the dramatic rise of low code software enables non-tech staff to build important aspects of tech infrastructure.

Among the topics we covered: 

  • How is low code disrupting the tech market?
  • What advice do you give to companies that want to do more with low code?
  • How is ServiceNow/IBM addressing the low code needs of its clients?
  • The future of low code in the enterprise? What can we expect in the years ahead?

Listen to the podcast:

Also available on Apple Podcasts

Watch the video:

11 Best Predictive Analytics Solutions
https://www.eweek.com/big-data-and-analytics/predictive-analytics-solutions/ (Thu, 16 Feb 2023)


Predictive analytics software and tools help companies make data-driven decisions. They offer data analytics features that allow users to mine large datasets and predict future outcomes – to the extent that’s possible in a rapidly changing marketplace.

These predictive analytics tools help organizations create robust models that can detect patterns, uncover trends, and ultimately provide valuable insights into their business operations. With so many predictive analytics tools in the market today, choosing the right one can be challenging.

In this article, we’ll look at 11 of the best predictive analytics tools and highlight their key features, so you can decide which is best for your business needs.

For more information, also see: Data Management Platforms

Jump to: 

  • What Are Predictive Analytics Tools?
  • 11 Best Predictive Analytics Tools
  • Vendor Comparison Chart
  • Honorable Mentions
  • Benefits of Predictive Analytics Tools
  • Key Features of Predictive Analytics Tools
  • How to Choose the Best Predictive Analytics Tool

What Are Predictive Analytics Tools?

Predictive analytics is a form of data analytics that uses historical data, machine learning (ML) algorithms, and artificial intelligence (AI) to predict future outcomes.

Predictive analytics tools are software used to predict future outcomes based on past data. They use statistical models, machine learning algorithms, and data mining techniques to analyze large amounts of historical data to identify patterns and trends. These patterns and trends can then predict future events or behaviors.

Predictive analytics tools can be used for many purposes, such as sales forecasting, marketing campaign planning, customer segmentation, risk management, fraud detection, and operational efficiency.
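
To make that workflow concrete, here is a minimal sketch of the pattern most predictive analytics tools wrap in a friendlier interface: fit a model on historical records, check it on held-out data, then score new records. It uses Python with scikit-learn; the churn.csv and new_customers.csv files and the column names are hypothetical placeholders.

```python
# Minimal predictive analytics sketch: learn from historical data, predict future outcomes.
# The churn.csv / new_customers.csv files and column names are hypothetical placeholders;
# numeric feature columns are assumed.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

history = pd.read_csv("churn.csv")                    # historical records
X = history.drop(columns=["churned"])                 # predictor variables
y = history["churned"]                                # known past outcomes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)   # learn patterns in past data
print("holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score new, unseen records to estimate future outcomes.
new_customers = pd.read_csv("new_customers.csv")
churn_risk = model.predict_proba(new_customers)[:, 1]
print(churn_risk[:5])
```

Commercial predictive analytics tools largely add data preparation, visualization, deployment, and monitoring around this core train-then-score loop.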

For more information, also see: Top Data Warehouse Tools

11 Best Predictive Analytics Tools

​​There are many predictive analytics tools available that can help businesses make informed decisions. Below are the top 11 predictive analytics solutions that can benefit organizations:

Microsoft logo

Microsoft Azure Machine Learning: Best for Creating and Deploying Predictive Models

Microsoft Azure Machine Learning (Azure ML) is an enterprise-grade machine learning service offered by Microsoft as part of its cloud computing platform. It is designed to make it easy for data scientists, machine learning engineers, and developers to build predictive analytics models.

The tool offers automated machine learning, data science workflows, and integrated cognitive services that allow users to quickly and easily create, train, and deploy models.

Azure ML supports the end-to-end machine learning life cycle, including data preparation, model building and training, validation, and deployment. It also offers management and monitoring capabilities, allowing users to track, log, and analyze data, models, and resources.

Additionally, Azure ML enables developers to integrate their models with existing IT systems, giving them access to valuable information that can inform decisions.
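
As a rough illustration of how that lifecycle can be driven programmatically, the following sketch submits a training script as a command job with the Azure ML Python SDK v2. The subscription, resource group, workspace, compute cluster, and environment names are placeholders, and the exact setup will differ from workspace to workspace.

```python
# Minimal sketch: submit a training script to Azure ML as a command job (SDK v2).
# Subscription, resource group, workspace, compute, and environment names are placeholders.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

job = command(
    code="./src",                                   # local folder containing train.py
    command="python train.py --n_estimators 200",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",  # curated env; name may vary
    compute="cpu-cluster",                          # an existing compute target
    display_name="train-churn-model",
)

returned_job = ml_client.jobs.create_or_update(job)  # submit and track the run
print(returned_job.studio_url)
```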

Features

  • Data Labeling: Allows users to label training data and manage labeling projects.
  • Interoperability: Integrates with other Azure services such as Microsoft Power BI, Azure Databricks, Azure Data Lake, Azure Cognitive Search, Azure Arc, Azure Security Center, Azure Synapse Analytics, and Azure Data Factory.
  • Drag-and-Drop Designer: Enables users to design with a drag-and-drop development interface.
  • Hybrid and Multicloud Support: Allows users to train and deploy models on-premises and across multicloud environments.
  • Policies: Offers built-in and custom policies for compliance management.

Pros

  • Enables teams to collaborate via shared notebooks, compute resources, data, and environments.
  • Connects with Microsoft data sources and file formats such as Excel, CSV, and Access files.
  • Supports various open-source libraries and frameworks such as Scikit-learn, PyTorch, TensorFlow, Keras, and Ray RLLib.
  • Provides governance with built-in policies and streamlines compliance with 60 certifications, including FedRAMP High and HIPAA.
  • Provides security via custom role-based access control, virtual networks, data encryption, private endpoints, and private IP addresses.

Cons

  • Users report that Azure ML offers relatively few prebuilt models.
  • Users report a steep learning curve.

Pricing

Microsoft Azure Machine Learning pricing is based on a pay-as-you-go model. The cost of using Azure ML depends on the instance type and the number of hours used.

For instance, the service cost may range from $0.096 to about $28 per hour, depending on the instance type. The cost of each instance type can be found on the Azure ML pricing page.

Microsoft offers discounts for long-term commitments and other special offers to further reduce the cost of using Azure ML.

IBM logo

IBM SPSS Modeler: Best for Data Mining

IBM SPSS Modeler is one of the most popular predictive analytics tools. It is a data mining and predictive analytics tool for business users, with a suite of machine learning algorithms to help uncover insights and patterns in data.

SPSS Modeler provides a range of data mining techniques, such as decision trees, neural networks, association rules, and sequence analysis, to help organizations identify patterns, relationships, and trends in their data. It also includes various predictive analytics techniques to help organizations forecast future outcomes and make better decisions. Moreover, SPSS Modeler enables users to develop and deploy predictive models in data flows.

IBM SPSS Modeler is also available within IBM Cloud Pak for Data—an enterprise-ready data and AI platform designed to help organizations unlock the value of their data, no matter where it resides—allowing users to build and run predictive models on any cloud or on-premises environment.

Features

  • Multi-Source Support: SPSS Modeler supports many data sources, including flat files, spreadsheets, relational databases, IBM Planning Analytics, and Hadoop.
  • Visual Analytics: SPSS Modeler includes an intuitive visual interface for creating and exploring data flows.
  • Open-Source Support: SPSS Modeler supports open-source technologies such as R, Python, Spark, and Hadoop.
  • Text Analytics: SPSS Modeler enables users to gather insights from unstructured data such as blog content, customer feedback, emails, and social media comments.
  • Geospatial Analytics: SPSS Modeler enables users to explore geographic data such as latitude and longitude, postal codes, and addresses.

Pros

  • Has a desktop application for Windows and macOS.
  • Offers drag-and-drop capabilities.
  • Includes data visualization.
  • Works on-premises and in the public or private cloud.

Cons

  • The time series and forecasting capabilities could be simplified.
  • Users report a learning curve for IBM SPSS Modeler.

Pricing

IBM SPSS Modeler offers three paid plans and a one-month free trial. However, pricing details for both the Professional and Gold plans are available on request. Potential buyers can contact the IBM sales team for custom quotes tailored to their organization’s use case.

  • Subscription: Starts at $499 per user per month.
  • Professional: Quote based.
  • Gold: Quote based.

For more information, also see: What is Big Data Analysis

H2O.ai logo

H2O Driverless AI: Best for Automation

H2O Driverless AI is a fully automated data science platform that accelerates the process of building and deploying AI-driven predictive applications. It is an end-to-end platform that automates building, optimizing, and deploying ML models.

H2O Driverless AI enables data scientists and ML engineers to build, deploy, and monitor large-scale models. It also provides a comprehensive set of features, such as automated feature engineering, automatic model selection, automated hyperparameter tuning, and automated model deployment as well as a visualization layer that helps users gain insights from their data.
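
Driverless AI itself is a commercial product with its own UI and clients, but the kind of automated model selection and tuning it performs can be illustrated with H2O.ai’s open-source H2O-3 AutoML library, a related but separate project. The file path and target column below are invented for the example.

```python
# Illustration of automated model selection and tuning using open-source H2O-3 AutoML
# (not Driverless AI itself); the file path and target column are placeholders.
import h2o
from h2o.automl import H2OAutoML

h2o.init()                                          # start a local H2O cluster
frame = h2o.import_file("historical_sales.csv")
train, test = frame.split_frame(ratios=[0.8], seed=1)

aml = H2OAutoML(max_models=10, max_runtime_secs=300, seed=1)
aml.train(y="revenue_next_quarter", training_frame=train)   # tries many algorithms and settings

print(aml.leaderboard.head())                       # ranked candidate models
predictions = aml.leader.predict(test)              # best model scores held-out data
```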

Features

  • Expert Recommender System: Use AI Wizard to analyze data, receive recommendations based on business needs, and gain instruction on the appropriate ML techniques to select based on unique data and use case requirements.
  • Automated Feature Engineering: Automatically identify the most important features and create new features based on a set of predefined rules.
  • Automated Model Selection: Select the best model for the given dataset.
  • Automated Hyperparameter Tuning: Tune the model parameters to improve its performance.
  • Automated Model Deployment: Easily deploy the model to a production-ready environment.
  • Visualization: Generate interactive visualizations to gain insights from the data.
  • Integration: Ingest data from Hadoop HDFS and Amazon S3.

Pros

  • Highly automated tool.
  • Efficient support team.
  • Support for graphics processing unit (GPU)-accelerated algorithms like XGBoost, TensorFlow, LightGBM, and GLM.
  • Deployable in on-premises, hybrid cloud, and managed cloud environments.

Cons

  • Requires strong statistics and machine learning experience.

Pricing

H2O Driverless AI doesn’t advertise rates on its website. Potential buyers can contact the sales team for custom quotes. They can also request a demo to understand the product better.

SAP logo

SAP Analytics Cloud: Best for Analytics

SAP Analytics Cloud is a cloud-based analytics and business intelligence software that enables users to leverage ML and AI capabilities to create predictive models and gain real-time insights into their data.

SAP Analytics Cloud allows users to analyze data from various sources, create predictive models and interactive visualizations, and collaborate with others to create and share reports. The software is built on the SAP HANA in-memory platform and integrates with SAP systems and other third-party applications. It can also be used to create custom applications and dashboards.

Features

  • Prebuilt Best Practice: SAP Analytics Cloud includes over 100 prebuilt best practice SAP business content packages for various businesses and industries to enhance analytics and planning projects.
  • Supported Data Sources: Users can connect to both on-premises and cloud data sources such as Google BigQuery, SAP HANA, SAP S/4HANA, SQL, SAP BusinessObjects solutions, SAP Business Planning and Consolidation, and OData.
  • What-If Simulation: Users can simulate different scenarios and visualize the impact of decisions on business outcomes.
  • Drag and Drop: SAP Analytics Cloud allows business users to design applications with low-code or no-code capabilities.

Pros

  • ​​​​Self-service data modeling and preparation.
  • Data exploration and visualization.
  • Efficient reporting capabilities.

Cons

  • Learning curve for new users.

Pricing

SAP Analytics Cloud is available on a subscription basis. Potential buyers can request custom quotes tailored to their requirements. The vendor offers a 30-day free trial with the option to extend it up to 90 days.

  • Business Intelligence: $36 per user per month.
  • Planning: Contact the sales team for a quote.

Alteryx logo

Alteryx: Best for Creating Automated Workflows for Data Analytics

Alteryx is an end-to-end self-service analytics platform that allows data analysts and business users to easily prepare, blend, and analyze data for predictive insights.

Alteryx is one of the most popular tools in the industry due to its powerful in-database engine and its ability to integrate with other systems and work with various structured and unstructured data. With this tool, users can share workflows across cloud, desktop, and on-premises environments; create interactive visualizations; and automate complex processes.

Features

  • Data Migration: Alteryx offers ELT and ETL capabilities to help make data migration processes seamless.
  • Automations: Users can automate analytics, reporting and geospatial analysis as well as data extraction from PDFs and other documents.
  • Drag-and-Drop Capabilities: Alteryx’s drag-and-drop capabilities allow users to speed up the analytics process for fast and accurate data insights.
  • Auto-Mapping: Alteryx’s data mapping AI enables users to map data to a predefined target.
  • Automated Formatting: Automatically detect and apply the format to unstructured and semi-structured datasets.

Pros

  • Connects to over 180 data sources.
  • Offers self-service data preparation.
  • Enables users to share workflows across cloud, desktop, and on-premises environments.
  • Alteryx has an active community forum with over 300,000 members.

Cons

  • Users report that the tool is pricey.

Pricing

Alteryx offers pricing tailored to individual users, teams, and organizations. Its pricing begins at $5,195 per year, and interested buyers can contact them for a custom quote. Alteryx also offers a 30-day free trial, so users can try out the platform and see if it meets their needs.

Alteryx provides several additional services and support options as well, such as custom training and onboarding and consulting services.

RapidMiner logo

RapidMiner Studio: Best for Data Science

RapidMiner Studio is a data science platform that helps users analyze, visualize, and build predictive models from their data. It includes various algorithms and tools for data mining, predictive modeling, and text mining. RapidMiner Studio also has interactive visualizations for exploring and understanding data.

The platform can be used for data mining, predictive analytics, and machine learning projects. RapidMiner Studio features a drag-and-drop interface, making it easy to use and allowing users to build complex models. It also has an open-source library that enables users to extend the tool and create custom algorithms.

Features

  • Hybrid Deployment: RapidMiner Studio can be deployed in on-premises and cloud environments.
  • Drag-and-Drop Capability: RapidMiner Studio makes analyzing and building reports fast and easy.
  • Prebuilt Automations: RapidMiner provides automated hyperparameter tuning and feature engineering.
  • Analytics Life Cycle Support: RapidMiner Studio supports data engineering, model building, model operations, AI app building and collaboration, and governance.

Pros

  • Has over 1,500 algorithms and functions for model building.
  • Offers a drag-and-drop interface.
  • Includes 10,000 data rows and one logical processor in its free edition.
  • Enables users to run simulations and what-if scenarios.
  • Enables teams to collaborate and build workflow together in real time.

Cons

  • Some users report that the solution is cost-prohibitive.

Pricing

RapidMiner Studio does not list pricing on its website. Those interested can request a custom quote by filling out the form on the RapidMiner website.

For more information, also see: Top Data Analytics Tools 

KNIME logo

KNIME: Best for Integration 

KNIME (Konstanz Information Miner) is an open-source data analytics platform for data mining, machine learning, predictive analytics, and business analytics.

KNIME features a graphical interface and provides a wide range of data analysis tools along with an extensive library of algorithms for machine learning and predictive analytics. It supports data-driven decision-making and can be used for data preprocessing, analysis, and visualization.

Features

  • Blend and Transform: KNIME enables users to connect to databases and data warehouses, access various file formats, and retrieve data from cloud resources or external services.
  • Data Analytics Support: KNIME offers support for various data analytics functions, such as classification, regression, dimension reduction, and clustering, as well as advanced algorithms like deep learning, tree-based methods, and logistic regression.
  • Prebuilt Automations: KNIME offers prebuilt automated components for feature engineering and selection, hyperparameter optimization, and model interpretability to enable fast prototyping and testing.
  • Open-Source Support: KNIME integrates with open-source projects such as Keras or Tensorflow for deep learning, H2O for high-performance machine learning, and R and Python for coding.
  • Governance: KNIME provides governance capabilities such as versioning, documentation, administration, and monitoring.

Pros

  • Has over 300 connectors to data sources.
  • Is deployable on-premises and in the cloud.
  • Connects to various database and data warehouse tools, such as SQL Server, Postgres, MySQL, Snowflake, Redshift, and BigQuery.
  • Enables users to visualize data with customizable bar charts and scatter plots as well as advanced charts, parallel coordinates, sunbursts, network graphs, and heat maps.

Cons

  • There is a steep learning curve.

Pricing

KNIME offers various pricing plans for different categories of users.

KNIME Community Hub (Hosted by KNIME)

  • Personal and Individual: Free.
  • Team: Starts from $285 per month.

Business Hub (Managed by the customer)

  • Standard plan (up to three teams): Contact the sales team for a quote.
  • Enterprise plan (unlimited teams): Contact the sales team for a quote.

TIBCO logo

TIBCO Statistica: Best for Windows Machines

Statistica is an advanced analytics software package designed by StatSoft and maintained by TIBCO. It provides predictive analytics, data mining, machine learning, forecasting, optimization, and text analytics capabilities to help organizations make decisions based on data.

Statistica is used in many different sectors, including banking, communications, energy, healthcare, insurance, life sciences, manufacturing, media, retail, and transportation. These industries use it to make better data-driven decisions with insights into customer behavior, product performance, market trends, and more.

Statistica offers many features and capabilities, including data preparation, visualization, predictive analytics, optimization, forecasting, and model building. It is designed to be used by both data scientists and business users, providing simplified tools and wizards to create and deploy models.

Features

  • Python and R Support: Allows users to use embedded code nodes to integrate open-source languages and libraries.
  • Data Visualization: Provides dynamic visualizations that help surface data-driven insights.
  • Rules Builder: Enables users to integrate business rules with data analytics.
  • Data Cleaning: Provides tools to filter out low-quality records so analytics and insights are based on clean data.
  • Data Preparation and Wrangling: Enables users to easily arrange and structure data for analytics.
  • Feature Engineering: Allows users to transform data into usable features.
  • Supports ML and Data Mining Models: Includes support for neural networks, decision trees, and support vector machines.
  • Integrations: Connects to marketplaces such as Azure ML, Algorithmia and Apervita.

Pros

  • Has rules builder to integrate business rules with analytics.
  • Enables users to build analytics, dashboards, and reports.
  • Is deployable on-premises or in the cloud.

Cons

  • Steep learning curve.

Pricing

Statistica doesn’t advertise pricing on its website. Those interested should contact the sales team for custom quotes or take advantage of its 30-day free trial.

SAS logo

SAS Advanced Analytics: Best for Analyzing Unstructured Data

SAS Advanced Analytics is a suite of software tools and applications used to create and deploy predictive models and data-driven solutions. The package provides a comprehensive set of tools to develop, deploy, and analyze predictive models, along with specialized tools to optimize business processes and strategies.

SAS Advanced Analytics includes a range of capabilities, including data mining, machine learning, text analytics, forecasting, optimization, and simulation. It also provides tools for creating data visualizations and dashboards. SAS Advanced Analytics is designed to enable organizations to use their data to gain insights, make better decisions, and optimize business operations.

Features

  • Optimization and Simulation: Optimization and simulation software helps identify the best actions and build models that drive decision-making from insights.
  • Data Mining: SAS Advanced Analytics enables users to simplify data preparation, quickly and easily create better models, and put their best models into service.
  • Data Preparation: Data preparation tools help users clean data and identify key relationships before modeling.
  • Data Visualization: SAS Advanced Analytics’ dynamic charts and graphs allow users to identify key insights in their data.
  • Data Science Development and Modeling: Drag-and-drop capabilities and automated, interactive processes help take the guesswork out of building data models.
  • Text Analytics: SAS Advanced Analytics automates the process of reading, organizing, and extracting useful details from datasets.
  • Statistical Analysis: SAS Advanced Analytics offers powerful algorithms covering multiple forms of analysis to help users build and customize data-driven reports.
  • Forecasting and Econometrics: Users can generate large quantities of high-quality forecasts quickly and automatically.

Pros

  • Interactive dashboard.
  • Drag-and-drop interface.
  • Ability to access and analyze data in-memory and in-stream.

Cons

  • Complex initial setup process.

Pricing

SAS Advanced Analytics pricing is not available publicly. However, a 14-day free trial is available with the option to extend. For more pricing information, organizations can request a demo or contact the SAS sales team for a custom quote.

Oracle logo

Oracle Data Science: Best for Data Science Teams

Oracle Data Science (also known as Oracle Cloud Infrastructure Data Science) is a fully-managed platform that enables data science teams to build, train, deploy, and manage machine learning models using Python and open-source tools.

Oracle Data Science provides an intuitive, collaborative environment powered by JupyterLab, so data scientists can experiment and develop models. Data scientists can scale up model training with support for popular machine learning libraries, such as TensorFlow and PyTorch, and powerful NVIDIA GPUs for distributed training.

Additionally, MLOps capabilities, such as automated pipelines, model deployments, and model monitoring, help keep models healthy in production.

Features

  • Data Preparation: Offers access to data in the cloud or on-premises and lets users apply labels to data records.
  • Model Building: Provides built-in, cloud-hosted JupyterLab notebook environments and supports open-source machine learning frameworks such as TensorFlow and PyTorch.
  • Model Training: Allows data scientists to build and train deep learning models using NVIDIA GPUs.
  • Governance and Model Management: Includes a model catalog, evaluation and comparison, explanation, reproducible environments, and version control.
  • Automation and MLOps: Includes managed model deployment, automated pipelines, and ML monitoring and applications.

Pros

  • The tool is highly customizable and scalable.
  • Intuitive user interface and interactive dashboard.

Cons

  • Users report that this tool is expensive.

Pricing

Oracle Data Science is a pay-as-you-go service, so rates depend on the compute engine product and consumption. Prices are based on the number of OCPUs per hour, performance units per gigabyte per month, gigabyte storage capacity per month, load balancer hour, Mbps per hour, and GPU per hour.

For example, the rate for Compute — Virtual Machine Standard — X7 is $0.0319 per OCPU hour, or $0.0638 unit price. For the load balancer base, the unit rate is $0.0113 load balancer hour. Visit the Oracle Data Science pricing page for more details.

Google Cloud logo

Google Cloud AutoML: Best for Building Custom Machine Learning Models

Google Cloud AutoML is a suite of machine learning products that enables developers with limited machine learning expertise to train high-quality models specific to their business needs. It is designed to make machine learning more accessible and easier to use.

Google Cloud AutoML offers tools that automate various aspects of the machine learning process, from training to deployment. It also provides tools for data labeling, model building, and model evaluation. These tools allow users to develop and deploy custom ML models with minimal effort.

Features

  • ML Training: Trains models on datasets larger than a terabyte.
  • Application Programming Interface (API) Integration: Exposes REST and gRPC APIs for programmatic access.
  • Language Support: Supports 50 language pairs.
  • Data Preparation and Storage: Prepares and stores datasets for analytics.

Pros

  • Enables developers with limited machine learning expertise to create custom machine learning models.
  • Provides streaming video analysis.
  • Trains models on terabyte-sized datasets.

Cons

  • Not deployable on-premises.

Pricing 

Google doesn’t advertise pricing for this product on its website. For custom quotes, contact the sales team.

However, Google Cloud AutoML offers users a free trial, allowing them to explore the various products and features of the suite. The free trial includes $300 worth of free credits, which can be used to explore the different Google Cloud products and services available.

Additionally, those working with organizations are eligible for an additional $100 in credits, totaling up to $400, to explore Google Cloud products for 90 days. Through the free trial, users can gain hands-on experience with the platform and explore the various features of the suite.

For more information, also see: The Data Analytics Job Market 

Top Predictive Analytics Tools Comparison Chart

Product | Best For | Pricing | Drag-and-Drop Capabilities | Deployment
Microsoft Azure Machine Learning | Creating and deploying predictive models | Starts at $0.096 per hour | Yes | Hybrid and multicloud environment
IBM SPSS Modeler | Data mining | Starts at $499 per user per month | Yes | On-premises, public and private cloud
H2O Driverless AI | Automation | Not available | Yes | On-premises, hybrid cloud, and managed cloud
SAP Analytics Cloud | Analytics | Starts at $36 per user per month | Yes | On-premises and cloud
Alteryx | Creating automated workflows for data analytics | Starts at $5,195 per year | Yes | On-premises and cloud
RapidMiner Studio | Data science | Not available | Yes | On-premises and cloud
KNIME | Integration | Starts at $285 per month | Yes | On-premises and cloud
TIBCO Statistica | Windows machines | Not available | Yes | On-premises and cloud
SAS Advanced Analytics | Analyzing unstructured data | Not available | Yes | On-premises and cloud
Oracle Data Science | Data science teams | Pay-as-you-go service | Yes | On-premises and cloud
Google Cloud AutoML | Building custom machine learning models | Not available | Yes | Cloud

Honorable Mentions

Here are a few other products worth mentioning when considering the best predictive analytics tools:

  • Q Research
  • Minitab
  • Anaconda Enterprise
  • DataRobot
  • Dataiku DSS
  • GoodData
  • MicroStrategy Analytics
  • Logi Info (Logi Analytics Platform)
  • MathWorks
  • RStudio
  • Domino Data Lab
  • Sumo Logic
  • FICO Xpress Insight
  • Amazon Machine Learning

Benefits of Predictive Analytics Tools

Predictive analytics tools provide several benefits to organizations. Here are some of the advantages that predictive analytics tools offer.

Improved Decision-Making

Predictive analytics tools enable businesses to make decisions based on data rather than gut feeling or intuition. This helps ensure decisions are well-informed and backed by evidence, leading to better results and return on investment (ROI).

Increased Efficiency

By streamlining processes, predictive analytics tools can help businesses increase efficiency and reduce costs. Automating the collection, organization, and analysis of data eliminates manual labor and cuts down on errors.

Enhanced Customer Experience

With predictive analytics, businesses can better understand their customer base and how they interact with their products or services. This helps companies tailor their offerings to meet customer needs, leading to greater customer satisfaction and loyalty.

Fraud Detection

Predictive analytics tools can detect fraudulent activity in various areas, such as credit card payments or financial transactions. By recognizing patterns and unusual behavior, businesses can stay one step ahead of criminals.
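
As a rough sketch of the “unusual behavior” idea, an unsupervised anomaly detector can flag transactions that deviate from historical patterns. The transactions.csv file and feature columns below are hypothetical.

```python
# Minimal anomaly-detection sketch for flagging unusual transactions.
# The transactions.csv file and feature columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import IsolationForest

transactions = pd.read_csv("transactions.csv")
features = transactions[["amount", "merchant_risk_score", "hour_of_day"]]

detector = IsolationForest(contamination=0.01, random_state=0).fit(features)
transactions["flagged"] = detector.predict(features) == -1   # -1 marks outliers

print(transactions[transactions["flagged"]].head())          # candidates for manual review
```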

Risk Mitigation

Predictive analytics tools can help businesses identify potential risks before they arise, allowing them to plan for any eventuality and manage their resources accordingly. This is especially useful for industries prone to risks, such as banking or insurance.

Businesses can achieve tremendous success and competitive advantage in today’s ever-evolving market by taking advantage of the benefits that predictive analytics tools offer.

Key Features of Predictive Analytics Tools

The best predictive analytics tools have the following key features:

  • Automation: Predictive analytics tools automate data collection, analysis, and reporting, making it easier to spot trends and insights.
  • Machine Learning Algorithms: Predictive analytics tools use advanced ML algorithms to analyze and identify patterns, trends, and relationships for accurate predictions about future events or outcomes.
  • Data Mining Capabilities: Data mining features allow users to uncover hidden patterns and correlations in large datasets, which can be used to make more accurate predictions.
  • Visualization: Dashboards and charts allow users to spot trends and patterns in their data and make better-informed decisions.
  • Customized Models: Custom models let users with unique datasets and needs tailor analyses to their specific application, ensuring accuracy and relevance.
  • Forecasting: Predictive analytics tools can generate forecasts and predictions, helping organizations plan for future events and anticipate potential risks (a minimal sketch follows after this list).
  • Management, Monitoring, and Reporting: Predictive analytics tools allow users to manage and monitor their models and data sources, track performance over time, and ensure predictions remain up-to-date and relevant.
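
To illustrate the forecasting capability noted above, here is a minimal sketch using Holt-Winters exponential smoothing from statsmodels; the monthly sales series is generated synthetically for the example.

```python
# Minimal forecasting sketch: fit a seasonal model to historical values and project ahead.
# The monthly sales series is synthetic, generated only for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

months = pd.date_range("2020-01-01", periods=36, freq="MS")          # three years of months
rng = np.random.default_rng(0)
sales = pd.Series(
    100 + 2 * np.arange(36)                                          # upward trend
    + 15 * np.sin(2 * np.pi * np.arange(36) / 12)                    # yearly seasonality
    + rng.normal(0, 3, 36),                                          # noise
    index=months,
)

model = ExponentialSmoothing(sales, trend="add", seasonal="add", seasonal_periods=12).fit()
forecast = model.forecast(6)                                         # next six months
print(forecast.round(1))
```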

How to Choose the Best Predictive Analytics Tool

There are several factors to consider when shopping for the best predictive analytics tool for your organization.

Purpose of the Tool

The first step in choosing the best predictive analytics tool is to determine the purpose of the tool. What type of predictive analysis are you looking to do? Does the tool need to predict customer behavior, such as churn rate, or do you need something more complex, like finding trends in customer data? Knowing the tool’s purpose will help in narrowing down the options and focusing on the tools that will best meet your needs.

Data Type

Consider what data needs to be analyzed. Depending on the type of data, you may need a tool specializing in big data or one offering a more comprehensive view. Additionally, think about the data’s structure. Is it structured, semi-structured, or unstructured? This can be a major factor when determining which tool to use.

Features

Assess the features of the available tools. Some essential features to consider are the ability to handle large datasets, customizable algorithms and parameters, data analysis and visualization, reporting, integration with other applications, and ease of use.

Depending on the complexity of a project, you may need specific capabilities such as text analytics, natural language processing, and deep learning. Many predictive analytics tools offer a variety of features, so it is vital to assess each one to determine which best fits your needs.

Consider Your Resources

It is essential to consider the resources available when choosing the best predictive analytics tool. This includes the amount of time and money available as well as the skills and experience of your team. Different tools offer different levels of complexity, so choosing the tool that best fits your resources is essential. Additionally, many tools offer support services that could be beneficial.

Test and Evaluate

After narrowing down your choices, the final step is to test and evaluate the different tools. Many tools offer free trials or demos, so take advantage of these to better understand how the tool works and fits into your project.

Additionally, reading customer reviews can help you understand the pros and cons of each tool. Ultimately, the best way to determine which option is right is to test and evaluate each tool against your needs.

Get Support

Finally, ensure you have access to support when choosing a predictive analytics tool. Many tools offer customer support services and training to help you get started and ensure you use the tool to its fullest potential. Additionally, having access to support can be invaluable if you have any questions or need assistance.

The post 11 Best Predictive Analytics Solutions appeared first on eWEEK.

Top Business Process Management Companies https://www.eweek.com/enterprise-apps/business-process-management-companies/ Tue, 24 Jan 2023 22:44:09 +0000 https://www.eweek.com/?p=221848 Business process management is the coordination of staff and computing systems to produce advantageous business outcomes. Along with traditional processes for accounting and finance, there are often manufacturing and supply chain processes to take into account. Factor in on-premises systems, the cloud, digital transformation and a myriad of applications, and the picture gets far more […]


Business process management is the coordination of staff and computing systems to produce advantageous business outcomes.

Along with traditional processes for accounting and finance, there are often manufacturing and supply chain processes to take into account. Factor in on-premises systems, the cloud, digital transformation and a myriad of applications, and the picture gets far more complex.

Hence, business process management (BPM) software has become vital in many businesses to keep track of everything. 

Jump to: 

BPM Trends

Increased Adoption of Cloud-Based Platforms

As more and more companies move their operations to the cloud, there is a growing demand for cloud-based platforms that can manage and automate business processes. These platforms are often more cost-effective and easier to scale than on-premises solutions.

Artificial Intelligence and Machine Learning

There is a growing interest in using artificial intelligence (AI) and machine learning to automate routine human tasks and make more accurate predictions and decisions. This can help companies become more efficient and improve the overall customer experience.

“With the release of GPT, AI continues to make broad impacts in business operations, and the demand to incorporate AI into process orchestrations will continue to rise,” said Malcolm Ross, senior vice president of product strategy at Appian.

Low-Code and No-Code Platforms

There is a growing need for platforms that allow nontechnical users to create and manage business processes without needing to write code. Low-code and no-code platforms have become increasingly popular, as they enable businesses to quickly create and deploy automation solutions.

“A low-code/no-code approach lets you build operational excellence without constant third-party developer involvement,” said Michael Donaghey, vice president of sales at CMW Lab. “Citizen and semi-professional developers can make changes at a faster pace at the same time, reducing a company’s workload, delivery time, and costs.”

Increased Automation of Customer Service and Support

Companies are increasingly turning to automation to handle customer service and support tasks such as answering frequently asked questions, scheduling appointments, and handling customer complaints.

“This helps to reduce the workload on human customer service representatives and can improve the overall customer experience,” said Ross.

Robotic process automation (RPA) is a mainstream technology for automating repetitive manual processes. Buyers are looking for larger automation platforms that simply incorporate RPA and allow them to easily orchestrate processes that traverse humans, AI, systems, and digital RPA workers.

“BPM for automation is evolving beyond just modeling and designing processes for automations,” said Tony Higgins, chief product officer at Blueprint Software Systems. “Solutions are available that enable automation programs to completely understand their RPA estates with robust analytics.”

Greater Focus on Process for Regulatory Compliance

As more sensitive data is shared and stored electronically, there is a growing need for secure, compliant automation solutions. This is especially true for companies that operate in regulated industries such as healthcare and finance. BPM helps with this. 

Data fabric is a rapidly evolving foundational layer that can drive intelligent process decisions and routing. Data fabric technology is designed to integrate data from multiple systems into a single and easily managed virtualized data model.

“This is important to ensure AI can be trained with a complete view of data, and business processes can easily traverse data silos,” said Ross.

End-to-end process orchestration moves a business activity or project through its entire life cycle, pulling in and processing data through bidirectional integrations. It goes beyond simple automation of repetitive tasks to tie together human activities with digital workers.

“By using a workflow that represents an entire business process from start to finish, more strategic value is created. More complex business logic is supported, and stakeholders have real-time visibility into status and KPIs (key performance indicators),” said Joe LeCompte, CEO and principal at PMG.

Also see: Top Digital Transformation Companies

How to Select a BPM Solution

Having a BPM platform that is optimized to your particular business offers significant competitive advantages.

As such, BPM platforms should include the following core elements:

  • Graphical business process or rule-modeling capabilities.
  • A process repository to handle modeling metadata.
  • A process execution engine / rule engine (see the sketch after this list).
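
To make the “process execution engine / rule engine” element more concrete, here is a deliberately tiny sketch of the idea in Python: a process is a sequence of named steps, and rules decide how a work item is routed between them. Real BPM engines layer persistence, human task management, integrations, and monitoring on top of this; the step names, rule, and threshold are invented.

```python
# Toy process/rule engine: routes a work item through steps, with rules choosing the path.
# Step names, the rule, and the threshold are invented for illustration only.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class WorkItem:
    data: Dict[str, float]
    history: list = field(default_factory=list)

Rule = Callable[[WorkItem], str]          # a rule inspects an item and names the next step

def approval_rule(item: WorkItem) -> str:
    # Invented business rule: large invoices need a manager, small ones auto-approve.
    return "manager_review" if item.data["amount"] > 10_000 else "auto_approve"

PROCESS: Dict[str, Rule] = {
    "submit": approval_rule,
    "manager_review": lambda item: "pay",
    "auto_approve": lambda item: "pay",
    "pay": lambda item: "done",
}

def run(item: WorkItem, start: str = "submit") -> WorkItem:
    step = start
    while step != "done":
        item.history.append(step)         # audit trail, the basis for process analytics
        step = PROCESS[step](item)
    return item

invoice = run(WorkItem(data={"amount": 12_500}))
print(invoice.history)                    # ['submit', 'manager_review', 'pay']
```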

Some systems remain on-premises, but cloud functionality is increasingly needed because so many processes now reside in the cloud. There is also wide variation in the size and scope of market offerings, and platforms range from basic to sophisticated systems.

The other key factor: how intuitive is the user interface? BPM solutions are necessarily complex, and if only highly trained individuals will be using yours, the user interface can be correspondingly complex. Yet if you expect wider use in your company, look for a more intuitive user interface.

Top BPM Solutions

eWEEK evaluated the many BPM solutions available on the market. Here are our top picks, in no particular order.

Appian logo

Appian

Best for: Companies seeking a low code / no code solution.

The Appian Platform is used to design digital software solutions, automate tasks to drive efficiency, and optimize business process operations. Its low-code approach simplifies the visualization and building of new applications and the establishment of workflows. It provides visibility to drive continuous optimization.

Key Differentiators

  • All of Appian’s design time experiences are low and no code for faster time to solution and value.
  • The platform includes all components necessary to deliver end-to-end process automations and reduce IT complexity and maintenance.
  • Visualization capabilities are easy for users to implement.
  • Appian helps organizations build new applications and workflows.
  • Automation capabilities are available across AI, intelligent document processing (IDP), RPA, and application programming interface (API) integration.
  • Appian offers integrated process mining and analytics.

CMW Lab logo

CMW Platform

Best for: Companies that want unified process automation.

The CMW Platform by CMW Lab is a low-code BPM suite built for unified process automation. It can be used for digital transformation. Building a process management system with CMW Platform is said to reduce development costs and lower the IT workload.

Key Differentiators

  • It provides quick deployment on-premises, in private or public cloud.
  • It uses a multi-tier hybrid software-as-a-service (SaaS) architecture and graph technologies to boost reliability and flexibility.
  • Users can see first results within days.
  • CMW Platform integrates scattered processes into one reliable system using API integrations, drag-and-drop capabilities, and other patented features on process management.
  • CMW Platform is a customizable web-based platform that lets users make changes on the go without developers.

Blueprint logo

Blueprint Software Systems

Best for: Companies that want to visualize their RPA deployment.

Blueprint enables organizations to visualize and understand their RPA estate, identify where there is waste and remove it, find retirement opportunities to reduce costs, become more efficient, and optimize any complex automations by refactoring them. It simplifies the process of migrating to another RPA platform.

Key Differentiators

  • Users can migrate their RPA estates automatically with 60–75% time and cost savings when compared to manual migrations.
  • Blueprint Software Systems ingests current automations, providing analytics and insight via dashboards about the complexity of automations and how many applications they interact with as well as the actions, variables, and decision branches of each process.
  • Blueprint Software Systems maps RPA estates into a common object model and makes processes automatically compatible with other RPA platforms.
  • Analytics help organizations better understand, optimize, and reduce operating costs.

Also see: Best Data Analytics Tools 

PMG logo

PMG

Best for: Companies that want a suite of dev tools.

The PMG Platform is a low-code process orchestration platform offering a suite of development tools to create business solutions for efficiency and a better user experience. Its workflow engine automates tasks, manages human activities, applies business logic, and transacts bidirectional integrations, supporting both short- and long-running processes.

Key Differentiators

  • PMG’s API Builder configuration tool provides a way to augment off-the-shelf applications and solutions that aren’t easily customizable.
  • If solution requirements include an end-user interface, it offers drag-and-drop configuration of portal and dashboard pages.
  • As a low-code platform, businesses can leverage snippets of code they already use or ensure they have the flexibility to do so without the need for specialized tech staff.
  • PMG’s Workflow Anywhere gives users the ability to run a workflow from anywhere within the platform or externally by using API Builder.
  • The PMG Relay Framework offers enterprises the ability to run workflows, including PowerShell, outside of their firewall, delivering workflow capabilities while still meeting more stringent internal security guidelines.

Nintex logo

Nintex

Best for: Companies that are focused on ease of use.

The Nintex Process Platform is a low-code process platform that helps companies discover, automate, and optimize business processes to drive growth. It offers intelligent forms, advanced workflow, digital document generation, e-signatures, RPA, process discovery technology, and a process management solution.

Key Differentiators

  • Nintex Process Platform provides ease of use coupled with strong capabilities for managing and automating processes.
  • The platform is optimized for less technical users, like the operations and process owners intimately familiar with the processes that need to be automated and managed.
  • The platform is focused on process management, automation, and improvement.
  • Nintex Process Platform helps users define the processes that make the most impact to the organization.
  • It lets users create simple process maps in collaboration with others in the organization.
  • It manages process participants to drive continuous process improvements.

Zvolv logo

Zvolv

Best for: Companies that want to boost hyperautomation.

Zvolv is a low-code unified platform aimed at driving hyperautomation for enterprises. It helps to accelerate digital transformation by streamlining processes across the organization with the combined use of no-code and low-code application development, automation, integrations, and analytics.

Key Differentiators

  • Zvolv integrates decision-making automation and orchestration of processes across systems.
  • The platform helps organizations tackle last-mile intelligent automation challenges that existing enterprise resource planning (ERP), BPM, or RPA tools cannot.
  • Automation bots and a low-code editor can be used by developers to enhance applications with complex use case definitions.
  • Dynamic dashboards, reports, and drill-down analytics are available for decision-makers.

Also see: Top Data Visualization Tools 

Oracle logo

Oracle BPM

Best for: Large enterprise organizations.

Oracle Business Process Management Suite 12c is designed to make things simple for business users via a web-based composer that allows them to model, simulate, optimize, deploy, and execute business processes. It also provides business-friendly mobile and web applications as well as out-of-the-box process and case analytics.

Key Differentiators

  • Oracle BPM offers the ability to manage by exception.
  • The platform allows the modeling of structured and unstructured processes.
  • Oracle BPM is a unified platform that spans systems, decisions, documents, and events.
  • A lightweight Business Architecture modeling tool in the Business Process Composer provides a blueprint of the enterprise and gives a common understanding of the organization.
  • Oracle BPM helps to align an organization’s goals and objectives and strategies with the actual projects that are being undertaken.

Bizagi logo

Bizagi BPM

Best for: Companies that want a common language between business and IT.

Bizagi BPM enables organizations to model, design, automate, and manage every business process on a single low-code platform. It provides the process insight and control to deliver value to the business.

Key Differentiators

  • Bizagi BPM automates dynamic and complex processes enterprise-wide.
  • It develops a common language between business and IT departments for faster development of applications.
  • It shares, reuses, and adapts process elements to rapidly respond to changes in the market.
  • Users can access real-time and historical reports to monitor business process performance and identify opportunities.
  • Bizagi BPM connects modern applications, databases, or legacy systems to provide a centralized view of business data.

Also see: Top Business Intelligence Software 

The post Top Business Process Management Companies appeared first on eWEEK.

The AI Market: An Overview https://www.eweek.com/big-data-and-analytics/ai-market/ Mon, 09 Jan 2023 23:54:46 +0000 https://www.eweek.com/?p=221799 If there’s a leading technology of the current era, artificial intelligence (AI) is clearly a top contender. The hype is constant and flows from all quarters. AI’s role in consumer products and enterprises alike is growing, rare for any technology. AI as a platform spans hardware, software, and on-demand services. All three categories have very […]


If there’s a leading technology of the current era, artificial intelligence (AI) is clearly a top contender. The hype is constant and flows from all quarters. AI’s role in consumer products and enterprises alike is growing, rare for any technology.

AI as a platform spans hardware, software, and on-demand services. All three categories have very different players, although there is some overlap between hardware and software players.

The number of U.S. AI companies has doubled since 2017. According to Tracxn Technologies, which tracks startup businesses, there were 13,398 artificial intelligence startups in the United States as of the third quarter of 2022.

IDC predicts the worldwide AI market, including software, hardware, and services, will grow from $327.5 billion in 2021 to $554.3 billion in 2024 with a five-year compound annual growth rate (CAGR) of 17.5%.

Also see: What is Artificial Intelligence 

What is AI?

Because it is so widely used, AI has become tricky to define. Ask ten people to define AI and you will likely get ten different variations. IT consultancy Gartner defines it as “applying advanced analysis and logic-based techniques, including machine learning (ML), to interpret events, support and automate decisions and to take action.”

This definition coincides with the current state of AI technologies, which is to say that AI performs data analysis and conducts actions based on the findings of that analysis. Analytics has been around for quite some time but it tends to use smaller data sets and offers smaller, less elaborate results. AI handles larger datasets and offers more probabilistic outcomes.

Clearly, there is a wide range of ways in which artificial intelligence can be used, not to mention the many subdivisions of AI, such as machine learning, deep learning, and natural language processing (NLP).

Therefore, it is incumbent on customers to ask AI vendors for their definition of AI and how their offerings will meet expectations. You have to make sure that their vision of AI aligns with your business needs.

Also see: Top AI Software 

AI Hardware

According to the market research firm Tractica, the global AI-driven hardware market is growing from a mere $19.63 billion in 2018 to an expected $234.6 billion by 2025. The AI-driven hardware market includes categories such as CPUs, GPUs, network products, and storage devices.

When it comes to AI hardware, the real players are chipmakers, because AI processing is vastly different from typical application processing using CPUs. For the most part, that involves GPU makers, but in recent years there have been startups using new chip designs specifically geared toward AI processing in the hopes of being more efficient and faster than GPUs.

The leading vendor in AI hardware is GPU maker Nvidia. It has repurposed chips normally used to accelerate video games as AI processors, working much faster than an x86 CPU. AMD was not much of a player in this field for a long time because it was struggling to survive, but it has made a remarkable comeback in recent years and is now making serious inroads in the AI and high-performance computing (HPC) market.

Intel is also finally finding its footing in the AI space. It has an inference processor, called Goya, and a processor for self-driving cars from its Mobileye unit, as well as its Altera FPGA line for training processing. But it never could quite get the GPU product right until now. Its Xe architecture will be sold under the Arc brand name for consumer GPUs, while the AI/HPC product will be known as Ponte Vecchio.

All of the major server vendors – top brand names like HPE, Dell, and Lenovo as well as vendors such as Supermicro, Wiwynn, and Inspur – have AI-oriented hardware using chips from Intel, AMD, and Nvidia.

Also see: AI vs. ML: Artificial Intelligence and Machine Learning

AI Software

Getting an accurate measure of the overall AI software market is challenging because many general-purpose applications have AI in them – this leads to the question of whether they should be considered AI software or merely software with AI capabilities.

If we go with the latter, the overall market is massive because it breaks down into so many different categories, each with multiple competitors.

And it’s not always who you might think is a leader. One of the biggest AI software vendors is Nvidia, which we’ve already mentioned in the hardware category. The company often boasts that it has more software engineers than hardware engineers, making AI software to run on its GPUs.

But there are many other vendors. Gartner estimates worldwide AI software revenue was $62.5 billion in 2022, an increase of 21.3% from 2021.

Also see: The Future of Artificial Intelligence

AI-as-a-Service

AI is also being made available as a service, just like software, infrastructure, platform, and other on-demand services through cloud service providers. AI-as-a-service has an appeal to many midsized and smaller enterprises because it means that they don’t have to make the massive investment in AI hardware.

AI hardware is extremely powerful. It’s also extremely expensive. The only real need for horsepower is in the training segment. The inference portion of AI, which is where it will mostly be used, does not require high-performance computing. A company may perform algorithm training just a few times a year, but then run inferencing against those algorithms as part of daily business.

That means a company’s expensive AI training hardware, which can easily run into the six and seven figures, will sit idle for long periods because it’s not needed. So why buy when you can rent for the short period you need it? Using AI-as-a-Service, a company on a budget can do the expensive training portion through a cloud provider for much less than the cost of investing in the hardware.
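
As a rough, back-of-the-envelope illustration of that buy-versus-rent trade-off, consider the hypothetical figures below; every number is invented and will vary widely by vendor, region, and workload.

```python
# Hypothetical buy-vs-rent comparison for AI training capacity; every number is illustrative.
server_cost = 300_000           # assumed purchase price of an 8-GPU training server
useful_life_years = 3
owned_cost_per_year = server_cost / useful_life_years          # ~$100,000/year, idle most of the time

cloud_rate_per_hour = 30        # assumed on-demand rate for a comparable 8-GPU cloud instance
training_hours_per_year = 250   # a few training campaigns per year
rented_cost_per_year = cloud_rate_per_hour * training_hours_per_year   # $7,500/year

print(f"own:  ${owned_cost_per_year:,.0f}/year")
print(f"rent: ${rented_cost_per_year:,.0f}/year")
# Renting wins when training time is low; heavy, continuous training shifts the math back toward buying.
```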

AI-as-a-Service is provided by the top cloud hyperscalers: AWS, Microsoft, Google, and in particular IBM. IBM has lagged behind the other major cloud vendors in overall cloud market share, but it has made a significant AI effort with IBM Watson Cloud. It allows companies to make AI a part of their existing applications to make more accurate predictions, automate decision-making processes, and get optimized solutions.

Watson has a number of pre-built applications, such as Watson Assistant, Watson Speech to Text, and Watson Natural Language Understanding. IBM Watson Cloud also provides AI solutions for specific markets such as AI for Customer Service, AI for Financial Services, and AI for Cybersecurity.

Also see: How AI is Altering Software Development with AI-Augmentation 

Toward an AI Strategy

For a business to truly gain the benefits of AI, it should be deployed enterprise-wide, because the benefits of AI can be most fully realized across virtually every department in the company. Gartner says a proper AI strategy identifies use cases, quantifies benefits and risks, aligns business and technology teams and changes organizational competencies to support AI adoption.

The first step is to focus on what your organization is trying to accomplish and the business problems you’re working to solve. AI does not have to be for new applications only. It can be made part of your existing suite of applications, as IBM is trying to do with Watson.

However, it should be done slowly and with great deliberation, for two reasons. First, there’s a learning curve inherent in every new technology. No matter how talented the IT staff, they are still going to need time to grasp all of the fundamentals of AI programming and integration with their applications.

The second reason is that Gartner has noted that organizations experiment with AI but often struggle to make the technology a part of their standard operations. That’s because AI is still in its early stages and the maturation process cannot be rushed. Gartner predicts that it will take until 2025 for half of organizations worldwide to reach what Gartner’s AI maturity model describes as the “stabilization stage” of AI maturity or beyond.

The Future of AI 

The future of AI looks to be more, faster, and larger investments. Clearly there will be many more use cases for AI and many more applications. The hardware will get faster, leading to more powerful AI systems. And the data sets will get bigger, meaning more complex AI applications in the future.

Beyond the usual speeds and feeds will be the next big step in AI, known as artificial general intelligence, or AGI. Whereas AI does what it is programmed to do, AGI has initiative. It asks questions that were not part of its programming and acts upon them.

That may frighten some people. From “2001: A Space Odyssey” to “The Matrix,” fear of AI coming to life and turning on humanity has been around for decades. Hopefully, the benefits will be obvious and overcome any fears induced by popular culture.

The post The AI Market: An Overview appeared first on eWEEK.
