eWEEK Staff, Author at eWEEK | Technology News, Tech Product Reviews, Research and Enterprise Analysis
https://www.eweek.com/author/eweek-staff/

2025 AI Trends Driving the Biggest Tech Transformations Today
https://www.eweek.com/artificial-intelligence/ai-trends/ | Fri, 21 Feb 2025
AI trends include the growth of generative AI, the rise of autonomous vehicles, and a greater focus on ethics and compliance.

AI is set to reshape the tech landscape in 2025, driving breakthroughs in enterprise IT, automation, cybersecurity, and software development. As AI frameworks evolve and tech stacks become more sophisticated, IT leaders must navigate a rapidly shifting environment where generative AI, autonomous AI agents, and quantum computing redefine business operations.

We will explore the key AI trends fueling the most impactful tech transformations, giving IT executives the insights they need to adapt, innovate, and navigate the changes safely.

KEY TAKEAWAYS

  • AI is evolving from a support tool to an autonomous system, creating new roles like AI Workflow Engineers while automating jobs like IT support and network monitoring.
  • RAG improves AI decision-making by retrieving real-time IT, cybersecurity, and customer service data.
  • Multimodal AI processes multiple data types at once, improving cybersecurity, IT automation, and business intelligence.
  • AI frameworks like LangGraph and CrewAI automate workflows while evolving tech stacks to improve scalability.
  • With IBM and Google leading innovations, quantum computing boosts AI in simulations, finance, and cybersecurity.

1. Agentic AI is reshaping IT jobs

Agentic AI isn’t just assisting IT teams anymore — it’s making decisions independently. If you work in IT, this shift will impact you directly. Agentic AI is evolving from a tool to an autonomous system, meaning your role could change significantly.

By 2028, Gartner predicts AI agents will be embedded in 33 percent of enterprise applications, up from less than 1 percent in 2024. Some IT jobs will disappear, while new opportunities will emerge.

You will see more of these IT jobs

  • AI Workflow Engineers: AI needs customization. According to Gartner, companies will need specialists to train and fine-tune AI workflows to match business goals.
  • AI Governance Specialists: As AI takes over decision-making, someone has to set the rules. According to Deloitte, AI Governance Specialist roles will be critical in compliance and risk management.
  • Prompt Engineers and AI Interaction Designers: AI’s output is only as good as the input it gets. If you understand how to structure AI prompts and refine interactions, this skill set will be in high demand.
  • AI Ethics Officers: AI is increasingly used in hiring, finance, and compliance. Someone needs to ensure AI makes fair, unbiased decisions, and that could be you.

You will see fewer of these IT jobs

  • Entry-level IT Support: AI chatbots and self-healing IT systems handle troubleshooting, password resets, and help desk tasks, reducing demand for IT support staff.
  • Network Monitoring Specialists: AI can now analyze and fix network issues in real time, eliminating the need for manual monitoring.
  • Basic System Administrators: AI is automating system updates, optimizing cloud resources, and detecting issues before humans notice them.
  • Entry-level IT Compliance Analysts: AI is already tracking regulatory changes and automating compliance updates, reducing the need for junior compliance analysts.
  • Basic QA Testers: AI-driven testing tools can detect bugs and correct errors automatically, making manual testing less necessary.

How to prepare for the AI workforce shift

If your job falls into one of these categories, it may be time to upskill and adapt; focusing on AI auditing, workflow automation, and security risk assessment will keep you relevant. Even though AI takes on more responsibilities, human oversight is critical, especially in compliance, ethics, and risk management.

Rather than viewing AI as a threat to your career, think of it as a collaborator. Companies that successfully integrate AI alongside their workforce, not as a replacement, will see the biggest benefits. If you invest in learning how to work with AI, you’ll be ahead of the curve. The key isn’t just adopting AI, but making sure you’re equipped to work with it.

2. Edge AI benefits real-time decision-making across industries

Unlike traditional AI models that rely on cloud computing, edge AI processes data locally on devices, sensors, or industrial systems; this eliminates delays, reduces bandwidth use, and improves efficiency. McKinsey highlights that edge AI is essential for industries that depend on real-time decision-making, where waiting for cloud processing isn’t an option.

How edge AI is changing the game in different sectors

  • Manufacturing is cutting downtime by using edge-AI-powered sensors to detect defects, optimize production, and predict equipment failures in real time. With AI-driven maintenance, factories can reduce costly disruptions.
  • Healthcare is becoming more responsive as edge AI enables real-time patient monitoring, emergency response, and faster diagnostics. AI-embedded medical devices detect anomalies instantly, reducing reliance on cloud-based systems.
  • Autonomous vehicles rely on edge AI for safety. Self-driving cars process road conditions, obstacles, and traffic changes instantly using AI on board instead of depending on cloud servers. This minimizes the risk of accidents caused by network delays.
  • Retailers are enhancing customer experiences with edge AI-powered inventory tracking, cashier-less checkout, and personalized shopping. By processing data locally, they can cut costs and improve supply chain efficiency.
  • Cybersecurity is becoming more proactive as edge AI detects threats instantly at the device or network level, preventing breaches before they spread. This real-time analysis reduces reliance on cloud-based security systems and strengthens data protection.

Edge AI is all about real-time, local processing; agentic AI takes it a step further by acting autonomously. While edge AI enables fast, on-device processing, agentic AI focuses on autonomy and long-term decision-making, making them complementary rather than interchangeable. As AI evolves, IT leaders must understand where each fits into their technology strategy to maximize efficiency and automation.
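To make the local-processing idea concrete, here is a minimal Python sketch of an edge-style control loop. Everything in it (the sensor reader, the scoring function, the threshold) is a hypothetical stand-in for a real on-device model and device driver, not a reference implementation.

```python
# Illustrative edge-AI control loop: score each reading locally and act
# immediately, with no cloud round trip. All names here are hypothetical.
import random
import time

ANOMALY_THRESHOLD = 0.8  # assumed tuning value

def read_sensor() -> float:
    """Stand-in for a device-driver call on a factory sensor."""
    return random.gauss(50.0, 5.0)

def local_model_score(reading: float) -> float:
    """Stand-in for on-device inference (e.g., a small quantized model)."""
    return min(abs(reading - 50.0) / 50.0, 1.0)

def halt_equipment() -> None:
    print("Defect risk detected; halting line locally")

def control_loop(iterations: int = 100) -> None:
    for _ in range(iterations):
        if local_model_score(read_sensor()) > ANOMALY_THRESHOLD:
            halt_equipment()  # the decision never leaves the device
        time.sleep(0.01)

if __name__ == "__main__":
    control_loop()
```

The structural point is that the reading never leaves the device, so decision latency is bounded by local compute rather than by network round trips.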

3. Retrieval-augmented generation enhances IT leaders’ decision-making

AI agents are everywhere, analyzing data, predicting outcomes, and making decisions without human intervention. According to a recent poll at The Wall Street Journal’s CIO Network Summit, 61 percent of IT executives are experimenting with AI agents, but 21 percent haven’t adopted them yet. One of the biggest reasons? Trust. About 29 percent of IT leaders cite cybersecurity and data privacy as primary concerns, and 75 percent feel AI currently delivers minimal value compared to its investment.

Despite these challenges, AI agents play key roles in various industries.

  • Johnson & Johnson uses AI agents in drug discovery to optimize chemical synthesis.
  • Moody’s applies multi-agent systems for financial analysis.
  • eBay has AI agents assisting in coding and marketing, adapting to employee preferences over time.
  • Deutsche Telekom and Cosentino have AI-powered digital assistants handling internal employee inquiries and customer orders.

How RAG can help

For AI agents to be effective, they need to make accurate, reliable, and up-to-date decisions, which is where retrieval-augmented generation (RAG) comes in.

Most AI models operate on preexisting knowledge, meaning they rely only on information they were trained on; that data can become outdated, leading to inaccurate predictions and poor decision-making. This is where RAG provides a critical advantage. Instead of making guesses based on stale data, RAG enables AI agents to retrieve and process real-time, relevant information from external sources.
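The mechanics are easy to see in miniature. In the hedged sketch below, a toy two-document corpus and keyword-overlap retrieval stand in for a real vector store and LLM call; only the retrieve-then-generate shape is the point.

```python
# Toy RAG loop: retrieve the most relevant document, then build a grounded
# prompt. A production system would use embeddings and an LLM API instead.
from typing import List

DOCUMENTS = [
    "Patch KB-2025-014 fixes the VPN gateway memory leak.",     # hypothetical
    "The standard password-reset SLA is four business hours.",  # hypothetical
]

def retrieve(query: str, docs: List[str], k: int = 1) -> List[str]:
    """Rank documents by naive keyword overlap with the query."""
    def overlap(doc: str) -> int:
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return sorted(docs, key=overlap, reverse=True)[:k]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query, DOCUMENTS))
    # The retrieved context grounds the model's answer in current data.
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("What fixes the VPN memory leak?"))
```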

For IT leaders looking to enhance AI-driven decision-making, RAG offers significant advantages in key areas.

  • IT operations: AI agents with RAG can monitor system logs, network health, and security updates in real time, adjusting proactively to prevent failures.
  • Cybersecurity: RAG-enhanced AI can detect and analyze emerging threats as they appear, helping cybersecurity teams anticipate risks.
  • Customer service: AI chatbots using RAG can provide accurate responses by retrieving the latest product details, company policies, and troubleshooting steps.
  • Legal and compliance: AI systems with RAG can track regulatory changes and assess risks in real time, reducing the chances of noncompliance.

AI agents are still evolving, and skepticism around their reliability remains valid. But with RAG, AI has the potential to be not just faster but smarter and more trustworthy, and that’s what will ultimately drive real value in enterprise IT.

4. Multimodal AI improves contextual understanding

AI is no longer about processing one type of data at a time. Multimodal AI is gaining traction because it can analyze text, images, audio, and video all at once, making AI systems more context-aware and responsive. The simple idea: AI is learning to process different types of data together, much as you use sight, sound, and language together to understand the world.

Drawing from various research papers and expert analyses, here’s how multimodal AI is making an impact in IT.

  • IT security and operations: If your role involves cybersecurity, AI can now go beyond standalone security tools and network alerts to analyze traffic patterns, detect unusual activity, and even process system audio alerts to spot threats more effectively. Your security team can catch risks earlier and with greater accuracy, reducing the number of cyber threats that might otherwise slip through.
  • IT support and automation: Slow resolutions are frustrating, especially when dealing with IT help desk issues. Multimodal AI is changing that by combining voice recognition, ticket analysis, and system diagnostics to troubleshoot problems before they escalate. This could mean faster responses, less downtime, and a seamless IT experience for your team and end users. Plus, AI-driven platforms can now detect patterns in reported issues, helping you fix recurring problems before they cause more significant disruptions.
  • Enterprise applications: Whether you’re managing IT infrastructure or supporting business intelligence efforts, multimodal AI makes it easier to extract insights from structured data such as reports and databases and from unstructured data such as handwritten notes, images, and voice recordings. However, implementing this isn’t as simple as flipping a switch; you will need substantial computing power, optimized cloud strategies, and AI-ready infrastructure to handle the increased data demands. You might look into edge computing and AI-specific hardware to make this work.

Adopting multimodal AI isn’t without its challenges; data integration, privacy concerns, and high computational costs are barriers you’ll need to navigate. The question to answer is: How quickly can you build the right infrastructure to support multimodal AI?
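To make the fusion idea concrete, the sketch below combines scores from three single-modality signals into one threat score. The scoring functions and weights are invented placeholders for real per-modality models, not a vetted detection method.

```python
# Illustrative late fusion: merge per-modality scores (network traffic,
# log text, audio alarm) into a single threat score.

def score_traffic(packets_per_sec: float) -> float:
    return min(packets_per_sec / 10_000.0, 1.0)

def score_log_text(log_line: str) -> float:
    suspicious = {"failed", "denied", "unauthorized"}
    return len(set(log_line.lower().split()) & suspicious) / len(suspicious)

def score_audio_alarm(alarm_active: bool) -> float:
    return 1.0 if alarm_active else 0.0

def fused_threat_score(pps: float, log_line: str, alarm: bool) -> float:
    # Weights are assumed for illustration, not tuned on real data.
    return (0.5 * score_traffic(pps)
            + 0.3 * score_log_text(log_line)
            + 0.2 * score_audio_alarm(alarm))

print(fused_threat_score(8_500, "login failed unauthorized token", alarm=False))
```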

5. AI frameworks for IT leaders and managers

Here are some AI frameworks that can help you streamline processes and optimize workflows; a framework-agnostic sketch of the orchestration pattern they share follows the list.

  • LangGraph helps you manage complex workflows with built-in moderation, ensuring system reliability and enabling seamless human-agent collaboration.
  • CrewAI allows you to organize AI-driven teams, facilitating dynamic decision-making and autonomous task delegation to improve IT operations.
  • AutoGen simplifies the development of scalable, event-driven AI agent systems, making it easier for you to enable collaboration and asynchronous communication in automated IT processes.
  • Haystack enhances search and retrieval capabilities, improving AI-driven natural language processing (NLP) and knowledge management for IT support and troubleshooting.
  • LlamaIndex helps you efficiently index and retrieve structured and unstructured data, making AI-powered insights more accessible for better decision-making.
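Each of these frameworks has its own API, so rather than reproduce any one of them, here is a framework-agnostic sketch of the orchestration pattern they share: a decision node that routes work to specialized agents. Real frameworks layer LLM calls, state persistence, and human-in-the-loop hooks on top of this skeleton.

```python
# Minimal agent-routing skeleton: a triage node delegates each task to a
# specialized worker. Agent names and routing logic are illustrative only.
from typing import Callable, Dict

def triage(ticket: str) -> str:
    """Decision node: pick a route based on the ticket contents."""
    return "network" if "vpn" in ticket.lower() else "general"

def network_agent(ticket: str) -> str:
    return f"[network agent] diagnosing: {ticket}"

def general_agent(ticket: str) -> str:
    return f"[service-desk agent] handling: {ticket}"

AGENTS: Dict[str, Callable[[str], str]] = {
    "network": network_agent,
    "general": general_agent,
}

def run_workflow(ticket: str) -> str:
    return AGENTS[triage(ticket)](ticket)  # route, then delegate

print(run_workflow("VPN drops every 10 minutes"))
```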

6. AI tech stacks for developers, engineers, and technical teams

AI tech stacks are the foundation for integrating AI into enterprise IT, combining machine learning models, AI frameworks, and cloud computing to improve scalability, automation, and performance. These tech stacks typically consist of four layers, sketched in code after the list:

  • Application layer: The front-facing software, APIs, and AI-driven applications that automate tasks and enhance decision-making.
  • Model layer: The backbone of AI systems, featuring pre-trained and custom AI/ML models that handle automation, analytics, and complex data processing.
  • Data layer: Pipelines, storage solutions, and data management frameworks that fuel AI applications.
  • Infrastructure layer: Cloud, on-premise, and edge computing environments that support AI workloads and optimize performance.
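One way to make the four layers concrete is to write a stack down declaratively, as in the sketch below. The component names are invented examples for illustration, not recommendations.

```python
# Illustrative four-layer AI stack expressed as a dataclass.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AIStack:
    application: List[str] = field(default_factory=lambda: ["ticket-summarizer API"])
    model: List[str] = field(default_factory=lambda: ["fine-tuned LLM", "anomaly detector"])
    data: List[str] = field(default_factory=lambda: ["event pipeline", "feature store"])
    infrastructure: List[str] = field(default_factory=lambda: ["GPU cloud pool", "edge nodes"])

stack = AIStack()
for layer, components in vars(stack).items():
    print(f"{layer}: {', '.join(components)}")
```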

Understanding the differences between AI frameworks and AI tech stacks

In short, AI frameworks guide IT leaders’ AI strategy and implementation, while tech stacks power the IT team’s execution and scalability.

AI frameworks provide a structured methodology for managing AI systems, focusing on governance, workflow automation, and process optimization. These frameworks help IT leaders and managers establish AI-driven strategies, ensuring compliance, scalability, and efficiency in enterprise environments.

AI tech stacks refer to the combination of tools, models, and infrastructure that enable AI applications. Tech stacks are essential for developers and engineers building AI-driven software, integrating machine learning, and managing data pipelines.

7. Hybrid cloud positions IT for AI-focused growth

AI-driven applications demand high computational power, dynamic storage, and flexibility, and hybrid cloud architectures are now essential for balancing performance with security compliance.

Integrating on-premise systems, private clouds, and public cloud services allows you to seamlessly manage AI workloads across multiple environments while maintaining cost efficiency, agility, and control over sensitive data. This hybrid approach optimizes workload distribution, allowing high-compute AI tasks to run in public clouds while keeping critical and regulated data on-premise.
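That placement rule can be expressed as a simple policy function. The sketch below is purely illustrative; the GPU-hour threshold and workload labels are assumptions, not guidance from any cloud provider.

```python
# Toy workload-placement policy: compliance first, then burst heavy jobs
# to the public cloud. Thresholds and labels are assumed for illustration.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    gpu_hours: float
    regulated_data: bool

def place(w: Workload) -> str:
    if w.regulated_data:
        return "on-premise"    # keep regulated data under direct control
    if w.gpu_hours > 100:
        return "public-cloud"  # burst capacity for compute-heavy jobs
    return "private-cloud"

for w in (Workload("model-training", 500, False),
          Workload("patient-record-scoring", 20, True)):
    print(w.name, "->", place(w))
```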

With AI continuing to evolve, hybrid cloud strategies ensure real-time adaptability to AI processing demands. You must enforce solid data governance, AI model security, and compliance standards, all while ensuring that AI workloads remain operational, efficient, and cost-effective.

For industries such as finance, healthcare, and e-commerce, where AI workloads are complex and high-volume, hybrid cloud models offer greater agility without compromising security. To remain competitive, you must develop a cloud-native AI infrastructure, optimize multi-cloud strategies, and adopt AI-driven automation tools to dynamically allocate resources and reduce operational risks.

Hybrid cloud solutions are expected to become the backbone of enterprise IT infrastructure, allowing IT professionals to deploy, scale, and manage AI workloads more efficiently. Assess your hybrid cloud strategy, ensure seamless integration, and position your IT operations for AI-driven growth.

8. AI for DevSecOps strengthens IT security

AI-driven cybersecurity solutions are transforming DevSecOps, helping you automate threat detection, vulnerability management, and compliance monitoring throughout the software development lifecycle. With machine learning and deep learning algorithms, you can detect threats in real time, identify system weaknesses, and respond to cyber risks automatically.

However, AI-driven security isn’t without challenges. AI-generated cyberattacks are on the rise, making it critical for you to integrate AI responsibly. Striking the right balance between innovation and strong security ensures that your data and IT infrastructure remain protected.

While automation boosts security efficiency, human oversight is still necessary. AI models can generate false positives or miss nuanced threats, so your security team must validate alerts and prevent alert fatigue. A manual verification process for critical vulnerabilities ensures a more balanced security approach.
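A minimal sketch of that balance might look like the following, where automation absorbs the noise while critical findings are forced through a human gate. The severity labels and actions are invented for illustration.

```python
# Illustrative alert-triage policy: auto-resolve low-risk findings,
# escalate critical ones to a human analyst for manual verification.
from typing import List, Tuple

def triage_alerts(alerts: List[Tuple[str, str]]) -> None:
    for name, severity in alerts:
        if severity == "critical":
            print(f"ESCALATE to analyst: {name}")  # human sign-off required
        elif severity == "low":
            print(f"auto-resolved: {name}")        # automation absorbs noise
        else:
            print(f"queued for batch review: {name}")

triage_alerts([
    ("suspected CVE in base image", "critical"),
    ("outdated linter version", "low"),
])
```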

Resource constraints can also be a challenge. AI-driven security tools require high-performance computing for real-time monitoring and data analysis, which can strain infrastructure. Cloud-based solutions offer scalability, but you must carefully manage resource allocation to optimize cost and performance.

With predictive threat intelligence, automated incident response, and adaptive threat modeling, you can stay ahead of cyber threats while keeping your DevSecOps pipelines agile and secure.

9. Quantum computing and AI convergence

Quantum computing converging with AI could be one of the decade’s most significant shifts in computing power.

AI is pushing the limits of traditional computing, especially in machine learning, data processing, and optimization problems. Quantum computing changes the game by handling complex calculations at speeds that traditional systems can’t match. Gartner and McKinsey predict AI powered by quantum computing will drive innovation in fields including cybersecurity, supply chain management, and scientific research.

What does this mean for IT decision-makers? AI models that struggle with massive datasets will be able to process information exponentially faster, leading to better predictions, deeper insights, and real-time decision-making. Think faster fraud detection in finance, next-level drug discovery in healthcare, and fully optimized logistics operations.

Tech giants including IBM and Google are already investing in AI-quantum integrations. IBM’s Quantum Roadmap is focused on bringing practical quantum-enhanced AI to enterprise IT, while Google’s Quantum AI team is exploring how quantum algorithms can optimize machine learning models. If these advancements continue at their current pace, AI could soon handle tasks that were previously impossible due to computing limitations.

The convergence of quantum computing and AI isn’t just about futuristic technology — it’s about preparing for a shift in AI capabilities. As quantum computing evolves, companies leveraging AI for cybersecurity, logistics, and financial modeling will gain a competitive edge.

10. Generative AI transforms content creation for IT marketing

Content marketing is essential for engaging IT buyers, explaining complex solutions, and building thought leadership. But keeping up with content demands isn’t easy.

Generative AI automates content creation, transforming how IT brands communicate, from blogs and white papers to videos, graphics, and interactive experiences. AI-powered tools make content production faster, more efficient, and highly personalized. AI streamlines content workflows while maintaining brand consistency and quality.

To be clear, AI isn’t supposed to replace your marketing team — AI should make them more efficient. If you’re not using generative AI to scale content, personalize messaging, and streamline workflows, your competitors probably are. In 2025, AI-powered content creation isn’t just an advantage — it’s a necessity.

Personalization with AI-driven content creation and automated workflows

Your customers expect clear, well-researched content that explains complex IT solutions. Tools like ChatGPT, Jasper, and Claude will help you create blogs, white papers, and technical case studies quickly and at scale.

These AI tools also support your marketing team in customizing messages for different buying personas, whether you’re targeting CIOs, IT managers, or procurement teams. AI-powered chatbots, email automation, and content engines adapt language and tone to match the needs of each audience.

Instead of spending hours writing blog posts, social media updates, or product descriptions, your team can use AI to generate content drafts, repurpose existing materials, and streamline approvals. AI is even helping marketers turn blogs into videos automatically.

AI-powered graphics and videos

Your marketing team can use AI for infographics, social media graphics, and ad creatives. Tools like Adobe Firefly, DALL-E, and Canva’s AI assistants allow teams to create branded graphics instantly without a designer.

Video marketing is popular, but video production costs are high. AI tools like Synthesia and Runway ML make creating explainer videos, product demos, and customer testimonials easier.

Future of AI: Ethical considerations and human interaction

AI will continue to evolve because it constantly learns from the data its users feed it; the more it is exposed to new data, the more sophisticated future AI models and tools will become. Since AI will be central to future technological advancement, users need to be aware of the ethical considerations it raises and of how humans should interact with it.

Human-in-the-Loop (HITL) design

HITL design incorporates human monitoring at different levels of AI development and decision-making; this approach helps ensure AI systems adhere to human values and ethical standards. HITL requires ongoing human monitoring, feedback, and intervention to help prevent biases, errors, and unintended consequences.
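In code, the core HITL pattern is just a confidence gate: the system acts alone only when it is confident, and routes everything else to a person. The threshold below is an assumed policy value for illustration.

```python
# Minimal human-in-the-loop gate. High-confidence decisions proceed
# automatically; low-confidence ones go to a human reviewer.
CONFIDENCE_FLOOR = 0.9  # assumed policy threshold

def decide(prediction: str, confidence: float) -> str:
    if confidence >= CONFIDENCE_FLOOR:
        return f"auto-approved: {prediction}"
    return f"routed to human reviewer: {prediction} (confidence={confidence:.2f})"

print(decide("invoice classified as routine", 0.97))
print(decide("transaction flagged as fraud", 0.62))
```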

This year, HITL will be considered standard practice in AI application development, helping ensure AI behavior is guided by human judgment and ethical considerations.

Ethical AI development and deployment

Ethical AI development entails establishing systems that are transparent, accountable, and equitable; it involves addressing common issues including bias, discrimination, and confidentiality. Developers must ensure AI systems do not reproduce or worsen existing societal inequities.

Ethical AI deployment should take into account the broader societal implications of AI, such as potential job displacement and economic shifts.

Balancing AI autonomy and human oversight

As AI systems become more autonomous, it is critical to establish a balance between autonomy and human supervision. While AI can improve efficiency and decision-making, it should not be used without proper human management and responsibility.

To maintain this balance between humans and AI, clear boundaries must be established for AI actions, along with defined points for human involvement when necessary. This balance will be especially important in applications such as healthcare, finance, and public safety, where AI judgments can have serious ramifications for people’s lives.

FAQs

What are the next big trends in AI?

The next big trend in AI will likely be the advancement of self-supervised learning and more efficient AI architectures, such as those based on transformers and neuromorphic computing. Self-supervised learning allows models to learn from unlabeled data, reducing the need for extensive data annotation and enhancing the adaptability of AI systems.

Additionally, integrating AI with edge computing and federated learning is expected to improve real-time processing and privacy by enabling models to learn and infer locally on devices without centralizing data.

What is the next AI breakthrough?

One of the anticipated breakthroughs in AI is the development of more advanced generative AI models that can create realistic content such as text, images, and even synthetic data with minimal human intervention. Advances in large language models (LLMs) and multimodal models that can understand and generate text, images, and audio are also on the horizon. These breakthroughs could lead to significant improvements in AI’s ability to handle complex tasks, create content, and interact more naturally with humans.

What fields will AI transform?

AI is expected to transform a variety of fields, with automation potentially displacing certain roles in industries like manufacturing, logistics, and customer service. Tasks that involve routine and repetitive activities are particularly susceptible to automation. However, rather than outright replacement, AI is more likely to optimize human roles by handling repetitive tasks and enabling workers to focus on more complex and creative aspects of their jobs.

Fields such as healthcare and finance may experience particularly significant changes, with AI supporting decision-making and enhancing efficiency rather than fully replacing human roles.

Bottom line: AI trends can offer significant benefits, though keep ethics top of mind

Advances in AI architectures are driving innovative applications across diverse fields, from climate action to transportation to media, and will transform how we live and work. This progress makes it crucial to address concerns about bias and fairness in AI to ensure these technologies benefit everyone; AI must be monitored as it grows more powerful. By focusing on ethical practices, you can make the most of AI’s potential while navigating its challenges.

Live Stream: AWS Partners LIVE!
https://www.eweek.com/news/join-the-aws-partners-live-stream-december-2024/ | Mon, 02 Dec 2024
Tune in to AWS Partners LIVE!, streaming directly from AWS marquee events like re:Invent and re:Inforce, to catch all the breaking news and exclusive insights. Sponsored by AWS.


Get a front row seat to hear real stories from customers and AWS leaders about navigating pressing topics, learn about new product launches, watch demos, and get behind-the-scenes insights. You can catch all the excitement on the AWS Partner Network YouTube channel, where we’ll be sharing breaking news and exclusive insights alongside our competency partners.

Day 1: GenAI LIVE!
Tuesday, December 3rd | 11 a.m. – 5 p.m. PT

Day 2: Security LIVE!
Wednesday, December 4th | 11 a.m. – 5 p.m. PT

Day 3: Industries LIVE!
Thursday, December 5th | 10 a.m. – 4 p.m. PT

eWEEK eSPEAKS Video Podcast with James Maguire
https://www.eweek.com/video/eweek-espeaks-video-podcast-james-maguire/ | Fri, 17 Sep 2021

Episode 36: TigerGraph’s Todd Blaschka: The Uses and Evolution of Graph Technology

Todd Blaschka, COO and CRO at TigerGraph, explains the advantages of graph analytics and graph databases. Watch: https://www.youtube.com/watch?v=QCzUHxpQl88

Episode 35: Cohesity’s Mohit Aron: Managing Data to Fight Ransomware

Mohit Aron, CEO of Cohesity, looks at the current state of ransomware, and provides advice about managing data for optimum security and productivity.

Episode 34: The Role of Data Governance in Effective Data Management

Four industry experts discuss the growing challenges in data governance, and highlight the issues that need to be resolved for better data management.

Episode 33: Cloudera’s Ram Venkatesh on Hybrid Cloud and the Cloudera Roadmap

Ram Venkatesh, Chief Technology Officer at Cloudera, discusses the company’s shifts, and explains how the Cloudera Data Platform serves a hybrid environment.

Episode 32: Semperis’s Mickey Bresman: How to Secure Your Active Directory

Mickey Bresman, Co-founder and CEO of Semperis, provides three tips for securing your Active Directory against cyberthreats.

Episode 31: Veritas’s Simon Jelley: How to Guard Against Ransomware

Simon Jelley, GM & VP of Product at Veritas Technologies, explains why ransomware is so difficult to defend against – and outlines critical best practices to lessen the threat.

Episode 30: Hitachi Vantara’s Radhika Krishnan on Data Fabric and Data Management

Radhika Krishnan, Chief Product Officer for Hitachi Vantara, explains the role of data fabrics and discusses how data storage and data analytics are merging.


Episode 29: Extreme Networks’ Nabil Bukhari on AI in Networking, and the Democratization of Technology

Nabil Bukhari, CTO and CPO of Extreme Networks, discusses how today’s networks need AI for proper monitoring, and also dives into how the democratization of technology is changing the tech sector in profound ways.


Episode 28: Weka’s Jonathan Martin on Enterprise Data Storage: Beyond Incremental Change

Jonathan Martin, the president of Weka, discusses how today’s exponential data growth is reshaping data storage, particularly in the analytics sector.


Episode 27: BMC’s Margaret Lee on AI Service Management: Use Cases, Future Trends

Margaret Lee, GM & SVP, Digital Service and Operations Management at BMC Software, discusses how AISM pairs with technologies like AIOps, and forecasts the future of AISM.


Episode 26: IBM’s Daniel Hernandez on AI and Data Fabric

Daniel Hernandez, General Manager, Data and AI at IBM, discusses how data fabric enables a cohesive data strategy to better enable artificial intelligence.


Episode 25: Sumo Logic’s Dave Frampton on Creating a Cloud Security Strategy

Dave Frampton, VP of Security Solutions at Sumo Logic, discusses the new threat surfaces that companies need to focus on protecting.


Episode 24: The CIO/CMO Relationship: Promoting Digital Transformation

Four top industry thought leaders discuss the key issues in the CIO-CMO relationship, and whether the challenges are a tech or a human problem.


Episode 23: Riverbed’s Vincent Berk on Improving Network Security

Vincent Berk, CTO and Chief Security Architect at Riverbed, discusses network visibility and how to transform network and application data into actionable security intelligence.

Episode 22: Rubrik’s Bipul Sinha on Preventing Ransomware Attacks

Bipul Sinha, Co-Founder and CEO of Rubrik, explains Zero Trust’s role in blocking ransomware attacks, and discusses security in a multicloud world.

Episode 21: SAP’s Irfan Khan on ‘Analytics Everywhere’

Irfan Khan, president, HANA Database & Analytics at SAP, discusses the value of an end-to-end technology platform that incorporates analytics throughout.


Episode 20: Komprise Co-Founder Krishna Subramanian on Data Management-as-a-Service

Krishna Subramanian, co-founder and president, Komprise, talks about using Data Management-as-a-Service to manage data in a multi-cloud environment.


Episode 19: Tableau’s Philip Cooper on Data Analytics ‘Available to All’

Philip Cooper, VP of Product, Tableau, discusses the trend toward making data analytics tools available to employees throughout the organization, not merely the C-suite.


Episode 18: Oracle Cloud’s Ross Brown on Oracle and Multicloud Computing

Ross Brown, VP of Product Marketing for Oracle Cloud, discusses trends in multicloud, and points out key differentiators for Oracle Cloud.


Episode 17: HashiCorp’s Armon Dadgar on Zero Trust Security

Armon Dadgar, Co-Founder and CTO of HashiCorp, discusses why “castle and moat” is outdated. Plus: what’s the future of cybersecurity in a multicloud world?


Episode 16: Understanding NetDevOps: Expert Advice

Three leading experts provide a deep dive into NetDevOps, including its history, best practices and common challenges, along with a look to the technology’s future.

Episode 15: Druva CEO Jaspreet Singh: Data Backup in the Cloud

Jaspreet Singh, CEO of Druva, spoke about key trends and challenges facing companies in cloud-based data backup.


Episode 14: VMware’s Bernard Golden: Advice for Cloud Clients

Bernard Golden, Executive Technical Advisor at VMware, talks about cloud’s evolution, the rise in complexity in IT, and the advice he gives to his VMware cloud clients.


Episode 13: Equinix’s Karl Strohmeyer on Data Centers, Cloud, and Evolving Infrastructure

Karl Strohmeyer, Chief Customer and Revenue Officer for Equinix, provides a portrait of the rapidly evolving IT infrastructure market.


Episode 12: CIOs Discuss the Post-Covid World: Will Hybrid Work Continue?

Four major thought leaders in the CIO community discuss the remarkable changes created by the pandemic. Which shifts are temporary, and which will be ongoing?


Episode 11: Precisely CEO Josh Rogers: Understanding Data Integrity

High levels of data integrity and data quality enable a data analytics process to offer truly accurate actionable insight.


Episode 10: Cisco’s Kaustubh Das: The Importance of Commonality in the Cloud

Kaustubh Das, VP and GM of Cloud and Compute at Cisco, discusses why a coherent overall approach is so essential in an enterprise cloud deployment.

Episode 9: Cognizant’s Paul Roehrig: Trends in Digital Transformation

Paul Roehrig, Head of Strategy, Cognizant Digital Business & Technology, discusses the complexities of digital transformation – and also defines this oft-used word.

Episode 8: Arkose Labs’ Kevin Gosschalk: The Fight Against Online Fraud

Kevin Gosschalk, CEO and Founder, Arkose Labs, discusses the state of online fraud, including the pandemic’s effect.


Episode 7: Alation CEO Satyen Sangani on Trends in Data Catalogs

Alation CEO Satyen Sangani talks about all things Data Catalog. We’ll look at what exactly a data catalog does, and talk about some tips and best practices for optimizing a data catalog.


Episode 6: NVIDIA’s Manuvir Das: The Future of Artificial Intelligence in the Enterprise

Manuvir Das, Head of Enterprise Computing at NVIDIA, discusses the current state and future trends in enterprise artificial intelligence.


Episode 5: Dell Sr. VP Deepak Patil: Cloud Computing in 2021

Deepak Patil, Senior Vice President, Cloud Platforms & Solutions, at Dell, discusses the importance of hybrid cloud, and also looks at future directions in the enterprise cloud market.


Episode 4: Micron’s Raj Hazra: How Data Infrastructure is Evolving

An industry leader discusses how the explosive growth in data and AI is driving equally fast growth in memory and compute capacity.


Episode 3: OpsRamp’s Ciaran Byrne: AIOps Trends

Ciaran Byrne, VP of Product Management at OpsRamp, discusses the AIOps market, including the key challenges facing this fast-growing sector.

Episode 2: Juniper’s Bob Friday: Preparing for AI in Your Business

Bob Friday, CTO of Juniper’s AI-Driven Enterprise Business, talks about taking steps to help your business deploy AI, focusing on the need for quality data.

Episode 1: Interview with Keith White of HPE GreenLake Cloud Services

The general manager of HPE GreenLake discusses the rapid changes in the cloud market, including how cloud users are focusing on growth post-pandemic.

Debunking Nagging Cloud Adoption Myths
https://www.eweek.com/cloud/debunking-nagging-cloud-adoption-myths/ | Wed, 03 Jun 2020

Forward-looking companies are embracing the so-called all-cloud enterprise, using the speed, agility and flexibility that the cloud offers. Gartner Research predicts the worldwide public cloud market will reach $331.2 billion by 2022. 

Cloud computing has evolved in many ways from its origins as application service providers in the late 1990s, becoming the new norm for existing and net-new applications. More companies, as they gain cloud maturity, are shifting from cloud-first toward an all-cloud, cloud-only model. Cloud evolution is happening everywhere: software plus hardware shifted to appliances, which quickly shifted to as-a-service offerings. We see this in cloud computing, data warehousing, CRM, and IT service management in the cloud.

Some companies, however, are still playing “cloud catch-up.” There are persistent cloud adoption myths that may be to blame, acting as barriers and preventing companies from leveraging the superpowers of the cloud to boost efficiency, security and innovation.

This eWEEK Data Points article uses industry information from Chadd Kenney, a former executive at Pure Storage and EMC, who is now vice president and chief technologist at startup Clumio, a data backup and recovery software-as-a-service (SaaS) provider.

Data Point No. 1: The Cost Perception Myth

Many people think the cloud is more expensive than it is, and that buying something as a service costs more than an on-premises solution. In actuality, cost is only an issue when companies are using the cloud inefficiently or not fully leveraging it to their benefit. It’s impossible for any enterprise to compete (i.e., build its own solution) with what the public cloud offers in terms of innovation, infrastructure efficiencies, flexibility, scalability, etc.

Most companies don’t have the manpower, talent or time to continually tinker with and optimize cloud solutions. This is where leveraging the innovation of the public cloud can help, instead of using it like another co-location and only replacing the on-prem infrastructure. SaaS and PaaS solutions offload some inefficiency and optimize for cloud while increasing the focus on strategic business aspects versus infrastructure.

Data Point No. 2: The ‘One and Done’ Myth

Many pros think that once they move apps to the cloud, that is the end game. But true innovation means constant evolution. You must continue to integrate, iterate and innovate. One of the biggest mistakes companies make is failing to adapt quickly, as new technologies emerge. If you just look at your cloud journey as “lift and shift” being the final destination, you aren’t deriving the full benefits of the cloud.

Consider all the cloud-related innovation to date: The local data center has shifted to co-location and then to the cloud (AWS, Azure, GCP, Oracle). Computing has evolved from bare metal servers to virtualization and now to serverless. The database has made the journey from software and storage to database appliances to the cloud (AWS RDS). Backup has evolved from legacy on-prem hardware and software to hyper-converged appliances to a backup-as-a-service model.

Data Point No. 3: The ‘Cloud Isn’t Secure’ Myth

That’s the perception, but it’s not reality. It’s been reported that 66% of IT professionals say security is their most significant concern when adopting an enterprise cloud computing strategy (source: Forbes). The cloud in general is very secure; that’s a major part of the cloud services business. Amazon’s responsibility is to provide security of the cloud. Your job (as an enterprise) is to provide security in the cloud. 

Sometimes people mess up on the security in the cloud, which causes news reports about the cloud being hacked. The large cloud providers already implement compliance programs for HIPAA, PCI DSS, FEDRAMP, SOX and many others. Every time a provider adds a new service or feature, those compliance certifications must be re-upped to ensure they meet the requirements of clients. 

This re-upping process is difficult and expensive for enterprises to take on as a DIY task, and the result would still not be as effective as what they could get “out of the box” from the cloud provider.

An all-cloud model eliminates extra hardware and software to size, configure, manage or buy, and this is the way forward. By debunking the aforementioned cloud adoption myths, more enterprises can focus on larger business initiatives rather than on the heavy lifting of racking, stacking and powering servers.

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.

BigPanda Provides Free 90-Day Access to IT Ops Platform
https://www.eweek.com/development/bigpanda-provides-free-90-day-access-to-it-ops-platform/ | Thu, 21 May 2020

BigPanda, a new-gen provider of what it claims is the first “autonomous operations” development platform infused with machine learning algorithms, announced this week that it will make an instance of its software available free for 90 days to IT professionals working from home to help combat the COVID-19 pandemic.

The free 90-day accelerator program is called Ops from Home. The purpose is to give IT operations, network operating center, DevOps and site reliability engineering teams access to BigPanda’s event-correlation and incident-automation platform with no obligation for participants after the three-month time period.

Back on March 16, the White House issued its Coronavirus Guidelines for America, requiring that the essential critical workforce, including health care providers, pharmaceutical companies, and food supply organizations, maintain normal work schedules. Many of the IT Ops teams responsible for keeping the lights on are working from home, struggling to support and ensure high availability of their most critical services.

The BigPanda IT Ops from Home accelerator program aims to help these organizations as well as select large enterprises that are struggling to maintain high service availability.

IT Ops from Home is designed for teams struggling with multiple monitoring tools, duplicate alerts and incidents about the same IT problem, hard-to-identify root causes, a lack of automation, minimal situational awareness across teams, dealing with system outages due to a high volume of traffic, and collaboration among distributed teams. 

The program offers the following benefits:

  • Ninety-day no-cost, no-obligation access to BigPanda’s core platform, which includes full AI/ML-driven event correlation capabilities.
  • Out-of-the-box integrations for select critical applications, services and monitoring tools, such as AWS, AppDynamics, CloudWatch, Splunk, Azure, Jira and Slack.
  • A rapid and thorough onboarding experience.
  • Free support for 90 days and free training via virtual classroom for IT Ops, NOC and DevOps/SRE team members.


A CEO’s Life Lessons Learned After Beating COVID-19
https://www.eweek.com/it-management/a-ceo-s-life-lessons-learned-after-beating-covid-19/ | Thu, 14 May 2020

By Jim Barrett

I’m a lucky guy. I’ve survived an ordeal that has killed tens of thousands of others, and I’ve learned a few things about myself along the way … things I think will make me both a better CEO as well as a better person.

It started on St. Patrick’s Day, March 17. I had recently traveled to San Francisco and Washington, but on this day, I felt ill with a fever, cough and shallow breathing. And while I wasn’t sure what was wrong, I’d seen all the stories about the novel coronavirus and decided to take no chances. I immediately self-isolated at my home in South Carolina, even though a great deal of misinformation was circulating at the time. I moved quickly, but still had my doubts that I would test positive. I was wrong.

I was tested four days later by my primary care provider. Six days after that, I got the official word: Yes, I’d tested positive for the COVID-19 virus. Nobody else in my family showed symptoms, but we couldn’t be sure: No one had been tested because the tests weren’t available. My primary concern was the safety of my family and making sure that no one else had gotten sick from me. Thankfully, that was the case.

But as the CEO of my company, Edge Technologies, I also had my concerns about the company; if I was not able to help direct the company, who would? Given that I was sleeping more than 12 hours a night, from 6 p.m. to 8 a.m. as my body tried to fight off the virus, that wasn’t an idle concern. That was especially true as many of our existing clients came at us, all asking for an increase in our ability to serve their needs, as their own workers (and millions of others around the world) began working from home.

Thankfully, I’ve recovered and am back to work. But I was struck along the way with several points that I learned, or re-learned, as a CEO:

  • Visibility in business is paramount. Too often, we take that for granted … we know where the information is and assume that others will know what we do, and that it’ll always be at their disposal. Well, it won’t; without real-time information being made available on a real-time basis, both internally and externally, companies quickly realize their shortcomings, both potential and real. This has proved especially true with the move toward remote workers, as business is not being done as usual. Many of our clients realized this and needed our help on an immediate basis; they realized serious gaps in the data they needed to access regularly, telling us “nobody could have seen this coming.”
  • Companies don’t have a lack of information. I realized that even when our clients (and we) were able to access the raw data, there was often no ability to put it into converged context. The people who might be able to do that … usually to be found in the next office … were working from their homes as I was, stretching out the decision-making process far more than usual. They had the data, but they had trouble making sense of it.
  • Communication is key. I realized that too often, I assumed I knew what my team members were thinking; this was especially true because we’re almost a fully remote workforce. But as our customers came to us as the COVID crisis unfolded, telling us of communications gaps that had erupted and urgently asking for our help in upgrading their systems to enable what I call “cogent connected data,” I realized that especially at times like this, you cannot over-communicate. I’ve always known that your team members need to know what’s going on at all points, if anything is changing, and … most importantly … why. I thought we did a pretty good job before all of this started, and we have to a large degree. But I realized amid the rapidly changing nature of this crisis that we constantly need to evaluate how we’re doing, and figure ways we can improve our communications with our team members, our partners and our clients … both talking, as well as listening to what they have to say.

I also learned a few things about myself, things that I never would have taken the time to consider before this enforced timeout:

  • It’s not as bad as you think. Being in quarantine had its moments, sure, but I found it gave me more time to think about my business and who I am as a person. My family was incredible; at mealtime, they’d prepare my food, leave it on a tray outside my door, knock on the door and move away. (They’re OK, by the way; none of them tested positive, for which I’m grateful.) I realized I could make do, at least in the short-term. Technology, of course, proved invaluable to keep in touch with my family as well as my business; I think I tried every single video conferencing tool on the market today.
  • Being a delegator is a good thing. I was working at half-capacity at most for the two weeks I was in quarantine; sleeping 14 hours a day will do that to you, as will the bad headaches I was getting while awake. I’ve always considered myself a delegator, hiring great people, pointing them in the direction we want to go as a business and letting them run with it. But this was a huge change of pace for me, and I came away with a renewed sense of appreciation for empowering others to pick up the slack. I’d tell others that if you hire the right people, this gives you more time to enjoy the better things in life. Or, in my case, to recover.
  • Family above all else. I couldn’t have done this without my family. They made it possible for me to focus on getting better. We came together, realizing the importance of the moment, and worked on making things run smoothly. Technology made it possible for my teenage boys to continue their online learning, and as noted above, video conferencing made it possible for us to keep in touch, even if we were just one room away. I never thought I’d be using the technology to ask my family for another cup of coffee. But I did.

As I said at the beginning of this article, I’ve been through an ordeal. I know it’s nowhere near as serious as what many others who’ve been infected have endured, or those who have paid the ultimate price. I hurt for them and for their families. I know I’ve been fortunate to have recovered as easily and as quickly as I did, and I’m determined to use the lessons I had the time to think about and learn during that time to move forward and be a better CEO. And a better person.

Jim Barrett is Chief Executive Officer at Edge Technologies based in Northern Virginia.

Why When It Comes to Recovery, Backup Useless, DR Priceless
https://www.eweek.com/storage/why-when-it-comes-to-recovery-backup-useless-dr-priceless/ | Tue, 12 May 2020

Backups are obviously important. Everyone—both individuals and businesses—should regularly back up their devices and data in case they become compromised. However, there is a distinct difference between backing up the data generated and owned by an individual versus the data created by an organization, no matter the size.

While simple backups may be sufficient for an individual, traditional backups alone are not enough for businesses. A primary purpose of having a secondary set of data is getting a business up and running after data loss, data corruption or ransomware, and backup systems are simply not up for the task of disaster recovery (DR).

It’s time to rethink data protection. Archiving data and sending it off to some faraway place with the hope that it will never be needed again is antiquated. Businesses can no longer afford to wait weeks, days or even hours to restore their data. Recovery must be instant. Thanks to the public cloud, DR and backup have been radically transformed in ways that make this possible.

In this eWEEK Data Points article, Sazzala Reddy, co-founder and chief technology officer at Datrium, explains why backup is useless for DR and how to do DR right to recover quickly from disasters big and small. 

Data Point No. 1: There are two ugly truths about backups.

One: It’s a Schrödinger’s backup situation: The state of a backup is unknown until you have to restore from it. 

Two: Backup systems are built for backing up data, not for recovery. It will take you days or weeks to recover your data center from backup systems. Mass recovery was never the design goal of backups.

Data Point No. 2: System availability is of paramount importance.

Times have changed. In today’s on-demand economy, we expect our IT systems to always be up and running. Any amount of downtime impacts customers, employees and the bottom line. 

Data Point No. 3: Ransomware is emerging as the leading cause of downtime.

Downtime can be caused by floods, tornados, fires, accidental human error and other unexpected events. However, ransomware, a new and rapidly growing phenomenon, is emerging as a leading cause of downtime. According to The State of Enterprise Data Resiliency and Disaster Recovery 2019, disasters ranging from natural events to power outages to ransomware affected more than 50% of enterprises in the last 24 months. Among these disasters, ransomware was identified as the leading cause with 36% of respondents reporting having been the victim of an attack. 

Data Point No. 4: Disaster recovery is more important than ever.

The sharp increase in ransomware attacks and other data threats has made backup useless and DR more important than ever before. While there are newer backup systems on the market today, they still aren’t capable of rapid and reliable recovery. Today, speed of recovery and how quickly you can get back online after an event are the name of the game—and winning requires a comprehensive DR-focused strategy. 

Data Point No. 5: Backups are useless in the event of a disaster.

While backups are a great first step, they are not an effective DR strategy due to the sheer amount of time and manual labor required to recover from traditional backups after a disaster. Imagine 100 terabytes of data stored in a backup system, which can restore at 500MB/sec (which is generous). In a disaster scenario, it will take two-plus days to copy the data from the backup system into a primary storage system. Effective and fast DR requires automation in the form of disaster recovery orchestration.  
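The arithmetic behind that estimate is worth making explicit; the back-of-envelope check below uses decimal units.

```python
# Restore time for 100 TB through a backup system reading at 500 MB/s.
data_mb = 100 * 1_000_000        # 100 TB expressed in MB (decimal units)
throughput_mb_per_s = 500

seconds = data_mb / throughput_mb_per_s
print(seconds / 86_400)          # ~2.3 days just to copy the data back
```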

Data Point No. 6: Full DR orchestration entails a series of steps.

Step 1: Access to the right data in a different infrastructure (backup and storage vendors sometimes forget that DR is not just about this step). 

Step 2: Bring up the workloads, in the right order, on the right systems, dealing with differences in networking, etc. This is vastly more practical and automatable for virtualized workloads than it is for physical. 

Step 3: Fail everything back to the originating site with the same concerns for workload sequencing, mapping, etc. (These last two steps require runbook orchestration, which is a key component of comprehensive DR; a minimal sketch follows.)
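As a rough illustration of what runbook orchestration automates in Step 2, the sketch below restarts hypothetical VMs in dependency order. A real orchestrator would also remap networks, verify health checks, and handle the Step 3 failback.

```python
# Illustrative runbook: power on workloads tier by tier at the recovery
# site. VM names and tier numbers are hypothetical.
RUNBOOK = [
    ("web-frontend-vm", 3),
    ("database-vm", 1),
    ("app-server-vm", 2),
]

def failover(runbook):
    for vm, tier in sorted(runbook, key=lambda entry: entry[1]):
        print(f"tier {tier}: powering on {vm} at recovery site")

failover(RUNBOOK)
```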

Data Point No. 7: Modern business requires instant RTO.

Because no business can afford to lose access to its systems for hours, days or even weeks, effective disaster recovery needs an instant RTO (recovery time objective), and the bottom line is that legacy backup systems were not designed for that. Effective DR solutions need to deliver instant RTO restarts.

Data Point No. 8: The public cloud has changed the game for DR.

The public cloud offers on-demand compute and elastic storage. You can keep your data in a geographic region of your choice on low-cost media and spin up compute only when disaster strikes, so you can work with that data. Additionally, you pay for resources only when you use them, in a disaster or a test. That's how the cloud is supposed to be used: elastic and pay-as-you-go. It's like only paying for insurance after you've had a car accident. 

Data Point No. 9: The key to effective cloud DR is converging cloud backup and DR.

A key part of DR is getting the data to a second site that's unaffected by the disaster and has compute resources available for post-recovery operation. To do this, your backups need to be deduplicated and stored at steady state on low-cost public cloud storage such as AWS S3. Then, in the event of a disaster, runbook automation instantly turns those backups into live VMs, delivering instant RTO for thousands of VMs. 
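As a rough illustration, here is a hedged sketch of that failover flow using boto3, the AWS SDK for Python. The bucket name and manifest layout are hypothetical, and a real DR product's runbook automation would also handle rehydrating deduplicated backups into bootable images, network mappings and workload sequencing:

```python
import json

import boto3  # AWS SDK for Python; assumes credentials are already configured

s3 = boto3.client("s3")
ec2 = boto3.client("ec2", region_name="us-east-1")

def failover(bucket: str, manifest_key: str) -> None:
    """Read a recovery manifest from S3 and launch instances in runbook order."""
    body = s3.get_object(Bucket=bucket, Key=manifest_key)["Body"].read()
    manifest = json.loads(body)
    # Hypothetical manifest shape: {"boot_order": [{"ami": "ami-...", "type": "m5.large"}, ...]}
    for vm in manifest["boot_order"]:
        ec2.run_instances(ImageId=vm["ami"], InstanceType=vm["type"],
                          MinCount=1, MaxCount=1)
        print(f"launched {vm['ami']} as a {vm['type']}")

# failover("dr-backups", "manifests/datacenter-a.json")  # hypothetical names
```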

Data Point No. 10: To be effective, DR needs to be simple, fast and affordable.

By leveraging the public cloud and new technologies, it is now possible to converge backup (low-cost media, granular recovery) and DR (orchestration software, random I/O performance). This truly simplifies DR with an approach that enables instant failover of an entire data center with the push of a button, eliminating the need to cobble together all of the backup and DR software pieces manually.

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.

The post Why When It Comes to Recovery, Backup Useless, DR Priceless appeared first on eWEEK.

]]>
https://www.eweek.com/storage/why-when-it-comes-to-recovery-backup-useless-dr-priceless/feed/ 0
eWEEK Moves to New Publisher, TechnologyAdvice.com https://www.eweek.com/innovation/eweek-moves-to-new-publisher-technologyadvice-com/ https://www.eweek.com/innovation/eweek-moves-to-new-publisher-technologyadvice-com/#respond Thu, 07 May 2020 15:00:00 +0000 https://www.eweek.com/uncategorized/eweek-moves-to-new-publisher-technologyadvice-com/ QuinStreet Inc.’s top technology information site, eWEEK, has been acquired by a new publisher, TechnologyAdvice.com, the two companies announced May 7.  Foster City, Calif.-based digital marketing service provider QuinStreet had been eWEEK’s home for the last eight years after it acquired the technology news, trends and product information site from Ziff Davis Enterprise in February […]

The post eWEEK Moves to New Publisher, TechnologyAdvice.com appeared first on eWEEK.

]]>
QuinStreet Inc.’s top technology information site, eWEEK, has been acquired by a new publisher, TechnologyAdvice.com, the two companies announced May 7. 

Foster City, Calif.-based digital marketing service provider QuinStreet had been eWEEK’s home for the last eight years after it acquired the technology news, trends and product information site from Ziff Davis Enterprise in February 2012.  

Terms of the transaction were not announced.

In fact, QuinStreet, which already has highly profitable businesses in insurance, financial services and education, sold all 39 of its B2B tech publications (eWEEK, Datamation, Webopedia, IT Business Edge and eSecurityPlanet among them) to TechnologyAdvice, a Nashville, Tenn.-based B2B marketing company that helps technology buyers find the best business technology and helps technology vendors connect with their ideal customers. Editorial operations at the publications will remain unchanged in the short term.

Focuses Only on B2B IT

TechnologyAdvice is a privately held company, founded by CEO Rob Bellenfant in 2006, that has built its own demand-generation business steadily and continues to grow. Unlike QuinStreet, TechnologyAdvice concentrates only on B2B marketing services, with deep experience in working with enterprise IT businesses—which is eWEEK’s sole focus.

TechnologyAdvice was named to the Inc. 5000 list of “America’s Fastest-Growing Private Companies” in 2014, 2015, 2016 and 2017.

Also on May 7, TechnologyAdvice announced it had acquired Quebec-based Project-Management.com, which serves practitioners and technology companies in the project management industry with technology reviews, training and thought leadership content. 

The acquisitions of QuinStreet's B2B business and Project-Management.com will round out TechnologyAdvice's existing marketing capabilities with:

  • custom content products, such as online events, eBooks, original and sponsored content;
  • intent-based marketing programs using first-party data;
  • digital advertising offerings that provide personalized marketing within a buyer’s natural research journey; and
  • highly specialized professional content across IT, software development and security.

Creating New Opportunities for Tech Buyers

“These acquisitions help us further our purpose, which is to create opportunities for technology buyers, technology vendors, our team members, and our communities,” TechnologyAdvice’s Bellenfant said. “Our ability to serve new and existing B2B technology clients in innovative and meaningful ways has just exploded. These deals solidify our expansion from specialized lead-generation services to a full-service media company that can offer clients a range of media products across the funnel and to technology companies of any size.”

eWEEK traces its roots to PC Week, a weekly newspaper founded in Boston in 1983 (and later a very popular weekly magazine, published until 2011); it took its current name in 2000 as its coverage extended far beyond the PC segment into the enterprise. It is among the longest-running IT trade publications in the world and continues to attract a loyal readership of IT managers, C-level executives, software developers and IT segment investors.

Editor Chris Preimesberger, who’s been with the publication since 2004, will remain in charge of the publication’s editorial operations.

“It’s wonderful and reassuring that TechnologyAdvice sought us out and wants to invest in our mission to bring the latest, most relevant IT product/services/company information and trends to our readership,” Preimesberger said. “As enterprise IT continues to branch out and become more and more complicated to buy, use and explain, the need for competent news and analysis from a third-party publication with a respected history becomes increasingly important to technology buyers.”

eWEEK will continue to showcase its well-known and respected writers and analysts, including Wayne Rash, Charles King, Rob Enderle, Zeus Kerravala, Peter Burris, Brian Solis, Frank Ohlhorst, Eric Kavanagh and others on a daily basis. Preimesberger, who created #eWEEKchat and eWEEK’s Innovation section in 2013 and features such as IT Science case studies and eWEEK Data Points articles in 2016, said he expects to add some new names to the writing/analysis lineup, in addition to new types of content to the publication.

In fact, eWEEK has already started its new eSPEAKS video-interview series, which will be publishing on the eWEEK YouTube channel soon. A new podcast series is also in the works.

The new email address for Preimesberger is chris.preimesberger@technologyadvice.com. An alternate address, cpreimesberger@eweek.com, is also operational.

The post eWEEK Moves to New Publisher, TechnologyAdvice.com appeared first on eWEEK.

]]>
https://www.eweek.com/innovation/eweek-moves-to-new-publisher-technologyadvice-com/feed/ 0
Why White-Box Models in Enterprise Data Science Work More Efficiently https://www.eweek.com/big-data-and-analytics/why-white-box-models-in-enterprise-data-science-work-more-efficiently/ https://www.eweek.com/big-data-and-analytics/why-white-box-models-in-enterprise-data-science-work-more-efficiently/#respond Thu, 30 Apr 2020 16:00:00 +0000 https://www.eweek.com/uncategorized/why-white-box-models-in-enterprise-data-science-work-more-efficiently/ Data science is the current powerhouse for organizations, turning mountains of data into actionable business insights that impact every part of the business, including customer experience, revenue, operations, risk management and other functions. Data science has the potential to dramatically accelerate digital transformation initiatives, delivering greater performance and advantages over the competition.  However, not all […]

The post Why White-Box Models in Enterprise Data Science Work More Efficiently appeared first on eWEEK.

]]>
Data science is the current powerhouse for organizations, turning mountains of data into actionable business insights that impact every part of the business, including customer experience, revenue, operations, risk management and other functions. Data science has the potential to dramatically accelerate digital transformation initiatives, delivering greater performance and advantages over the competition. 

However, not all data science platforms and methodologies are created equal. Using data science to make predictions and decisions that optimize business outcomes requires transparency and accountability. Several underlying factors contribute, such as trust, confidence in the predictions and an understanding of how the technology works, but fundamentally it comes down to whether the platform uses a black-box or a white-box model approach. 

Black-box testing or processing is a method in which the internal structure, design and implementation of the item being tested are not known to the tester; in white-box testing or processing, they are known to the tester.

Once the industry standard, black-box-type machine-learning projects tended to offer high degrees of accuracy, but they also generated minimal actionable insights and resulted in a lack of accountability in the data-driven decision-making process.

On the other hand, white-box models offer accuracy while also clearly explaining how they behave, how they produce predictions and what the influencing variables are. White-box models are preferred in many enterprise use cases because of their transparent “inner-working” modeling process and easily interpretable behavior.

Today, with the advent of autoML 2.0 platforms, the white-box model approach is becoming a trend for data science projects. In this eWEEK Data Points article, Ryohei Fujimaki, Ph.D., founder and CEO of dotData, discusses five key reasons why white-box data science models are superior to black-box models for deriving business value from data science. DotData is a provider of full-cycle data science automation.

Data Point No. 1: The machine-learning modeling process must be transparent.

It is important for both analytics and business teams to understand the varying levels of transparency and their relevance to the machine learning process. Linear and decision/regression tree models are fairly transparent in how they generate predictions. However, deep learning (deep neural network), boosting and random forest models are highly non-linear black boxes that are difficult to explain. While black-box models can have a slight edge in accuracy scores, white-box models offer far more of the business insights that are critical for enterprise data science projects. White-box transparency means that the exact logic and behavior needed to arrive at a final outcome is easily determined and understandable. 
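As a concrete illustration of that transparency, the short script below fits a shallow decision tree with scikit-learn and prints its complete decision logic; every split and threshold can be read and audited, which a deep neural network cannot offer:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier, export_text

# Public demo dataset; any tabular classification data works the same way.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The entire decision logic can be printed and audited, rule by rule.
print(export_text(model, feature_names=list(X.columns)))
```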

Data Point No. 2: Features have to be interpretable.

Data scientists obviously are math-oriented and tend to create complex features that might be highly correlated with the prediction target. For example, consider the following engineered feature for customer analytics: "log(age) * square-root(family income) / exp(height)." One will not be able to easily explain its logical meaning from the viewpoint of customer behavior. In addition, deep learning (neural networks) computationally generates features. It is not possible to understand such deep non-linear transformations; thus, incorporating this type of feature makes the model a black box. In today's regulatory environment, the need to explain the key variables driving business decisions is important. White-box models can fulfill this need and thus are gaining in popularity.

Data Point No. 3: Insights must be actionable for model consumers and business users.

Model consumers use ML models on a daily basis and need to understand how and why a model made a particular prediction so they can plan how to respond to it. Understanding how a score was derived and which features contributed allows consumers to optimize their operations. For example, a black-box model may indicate that "Customer A is likely to churn within 30 days with a probability of 73.5%." Without a stated reason for the churn, a business user will not have enough information to determine whether the prediction is reasonable. In contrast, white-box models typically give a different kind of answer, such as, "Customer A is likely to churn next month because Customer A contacted the customer service center five times last month and usage decreased by 25% in the past four months." Having the specific reasoning behind the prediction makes it much easier to determine the validity of the prediction, as well as what action should be taken in response.
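A minimal sketch of how such reasons can be surfaced, using a linear model whose per-feature contributions are directly readable (the toy churn data below is fabricated to mirror the example above):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: [support-center contacts last month, % usage change over 4 months]
X = np.array([[0, 5], [1, 0], [5, -25], [4, -30], [0, 10], [6, -20]])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = churned

model = LogisticRegression().fit(X, y)
features = ["support_contacts", "usage_change_pct"]

customer = np.array([5, -25])
prob = model.predict_proba([customer])[0, 1]
contributions = model.coef_[0] * customer  # each feature's pull on the log-odds

print(f"churn probability: {prob:.0%}")
for name, c in zip(features, contributions):
    direction = "toward" if c > 0 else "away from"
    print(f"  {name}: {c:+.2f} ({direction} churn)")
```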

Data Point No. 4: Models must be explainable.

In enterprise data science projects, data scientists and model developers have to explain how their models behave, how stable they are and which key variables drive the predictions. Explainability is therefore absolutely critical for model acceptance. White-box models produce prediction results alongside their influencing variables, making predictions fully explainable. This is especially important in situations where a model is used to support a high-profile business decision or to replace an existing model. Model developers have to defend their models and justify model-based decisions to other business stakeholders. 

Data Point No. 5: Accountability is critical.

As more organizations adopt data science into their business processes, there are increasing concerns about accountability and about decisions based on information that is personal and can sometimes be interpreted as discriminatory. Because they provide increased transparency and explainability, white-box models help organizations stay accountable for their data-driven decisions, maintain compliance with the law and withstand potential legal audits. In contrast, black-box models exacerbate the issue, because less is known about the influencing variables that actually drive final decisions. 

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.

The post Why White-Box Models in Enterprise Data Science Work More Efficiently appeared first on eWEEK.

]]>
https://www.eweek.com/big-data-and-analytics/why-white-box-models-in-enterprise-data-science-work-more-efficiently/feed/ 0
How to Make Sure Your VPN Access Remains Seamless https://www.eweek.com/networking/how-to-make-sure-your-vpn-access-remains-seamless/ https://www.eweek.com/networking/how-to-make-sure-your-vpn-access-remains-seamless/#respond Wed, 29 Apr 2020 16:00:00 +0000 https://www.eweek.com/uncategorized/how-to-make-sure-your-vpn-access-remains-seamless/ Recent events have compelled organizations of all sizes and across industries to adopt new work approaches that keep employees safe at home while ensuring productivity and security. According to a report by Willis Towers Watson, nearly half (46%) of organizations are implementing work-from-home policies because of the COVID-19 pandemic. As a result, companies are relying […]

The post How to Make Sure Your VPN Access Remains Seamless appeared first on eWEEK.

]]>
Recent events have compelled organizations of all sizes and across industries to adopt new work approaches that keep employees safe at home while ensuring productivity and security. According to a report by Willis Towers Watson, nearly half (46%) of organizations are implementing work-from-home policies because of the COVID-19 pandemic. As a result, companies are relying on virtual private networks (VPNs), which establish encrypted connections to enterprise applications over the public internet, to connect their workforce.

In the past, VPNs were known to cause various levels of grief in many organizations because they can be tricky to implement and maintain. But they’re also very important components in enterprise security, and implementations have improved markedly in recent years when it comes to user-friendliness.

Many organizations have used VPNs for years to provide seamless connectivity without compromising security for employees who travel or work remotely. These VPN endpoints are typically set up to support 5% to 10% of a company's workforce at any given time. Ongoing VPN support for 100% of the workforce at companies around the world is unprecedented, and this "new normal" is putting unforeseen stress on both corporate and public networks.

There are important steps companies can take to address these challenges so that connecting to enterprise networks doesn’t leave employees frustrated during a time when stress levels are already high. These same best practices can support an enduring strategy for managing an increasingly mobile and remote workforce as the nature of work shifts.

This eWEEK Data Points article is based on industry information supplied by Karthik Krishnaswamy, director of product marketing at NS1.

Data Point No. 1: VPN Security

VPNs are intrinsically designed to be encrypted tunnels that protect traffic, making them a secure choice for enabling remote work. Even with the increased number of people connecting to VPNs, this remains true. However, cyber-criminals do take advantage of times of chaos to attack corporate infrastructure like VPNs.

The strategy cyber-criminals typically employ is to obtain a person’s network credentials to access the VPN and, by extension, the employer’s networks and systems.

With so many more VPN users, the pool of potential victims who could lose their credentials is larger than ever before. Knowing this, companies can ensure they properly secure their VPNs by enabling and requiring two-factor authentication as a second layer of protection.

With two-factor authentication, even if a cyber-criminal obtains an employee's login credentials, they won't be able to access the VPN or network without additional information, such as a one-time-use security code sent to a preselected mobile number or, ideally, generated by a token application. While no security measure can guarantee complete protection, setting up two-factor authentication makes it much more difficult for a cyber-criminal to take advantage of increased VPN usage.
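For reference, the one-time codes produced by token applications typically follow RFC 6238 (TOTP). The sketch below shows the core computation using only the Python standard library; a production deployment should use a vetted library and tolerate clock drift between client and server:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # time step since the epoch
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(user_code: str, secret_b32: str) -> bool:
    """Server-side check against the code the user typed."""
    return hmac.compare_digest(user_code, totp(secret_b32))

# print(totp("JBSWY3DPEHPK3PXP"))  # example shared secret, base32-encoded
```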

Data Point No. 2: Add New VPNs to Support Increased Demand

Once a company has secured its VPN endpoints, it may find that its current infrastructure does not adequately support the entire workforce. A report from Atlas VPN estimates that VPN usage could increase by 150% as the coronavirus continues to spread. Companies can manage the increased demand by adding endpoints in multiple regions. Depending on the company's VPN architecture, this can be done through a cloud provider by increasing seats, by adding licenses to the existing VPN hardware solution, or by purchasing and deploying new VPN servers. One may also be able to enable VPN capabilities on existing edge network devices, which can be a great short-term solution because it increases capacity without incurring additional capital expenses.

Data Point No. 3: Ensure Positive Employee Experience With VPN Traffic Steering

While increasing the number of VPN servers will help ensure a company has the capacity to accommodate more employees working remotely, there may still be issues with performance or availability if all the users log in to the same VPN server.

To accommodate this increased demand, organizations can optimize how their VPN servers are used. Today it is often up to the employee to pick an endpoint from a list at random, and many keep connecting to the same "default" endpoint for days or weeks, regardless of its usage or capacity.

Worse yet, if a user cannot connect to their normal endpoint due to high traffic volume, the client will often select a backup without regard to location or load, resulting in slowness or outright disconnections. Traffic steering at the DNS layer addresses this: rather than relying on a static list, each lookup of the VPN hostname resolves to the healthiest, closest or least-loaded endpoint for that employee.
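The decision such a steering service makes on every DNS lookup can be boiled down to a few lines. In the sketch below (the hostnames, regions and load figures are hypothetical), a healthy in-region endpoint is preferred, but traffic spills over to the least-loaded alternative when the local endpoint is saturated:

```python
# Hypothetical endpoint inventory a steering service might maintain,
# refreshed by health checks and load telemetry.
ENDPOINTS = [
    {"host": "vpn-us-east.example.com", "region": "us-east", "load": 0.82, "healthy": True},
    {"host": "vpn-us-west.example.com", "region": "us-west", "load": 0.35, "healthy": True},
    {"host": "vpn-eu.example.com",      "region": "eu",      "load": 0.50, "healthy": False},
]

def steer(client_region: str) -> str:
    """Pick the endpoint a DNS answer should point this client at:
    a healthy in-region endpoint under 80% load if one exists,
    otherwise the least-loaded healthy endpoint anywhere."""
    healthy = [e for e in ENDPOINTS if e["healthy"]]
    local = [e for e in healthy if e["region"] == client_region and e["load"] < 0.8]
    return min(local or healthy, key=lambda e: e["load"])["host"]

print(steer("us-east"))  # -> vpn-us-west.example.com (us-east is saturated)
```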

Data Point No. 4: Monitor Performance to Adapt as Needed

Lastly, continuous monitoring is a crucial step to making sure your VPN connections remain accessible and performant for employees. Many tools provide valuable insight that can help companies evaluate and adjust capacity as needs change. Consistent monitoring can also demonstrate trends about when employees are connecting the most often, and from which geographies. This allows companies to better plan for times of high volume, create strategies for when to add more VPNs based on employee growth plans and set up informed traffic routing rules, optimizing VPN usage long term.
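As one example of the telemetry involved, the sketch below measures the median TCP connect time to a VPN endpoint (the hostname is hypothetical); feeding probes like this into a time-series store reveals exactly the usage and latency trends described here:

```python
import socket
import statistics
import time

def probe(host: str, port: int = 443, samples: int = 5) -> float:
    """Median TCP connect time to an endpoint, in milliseconds.
    A failed connection raises, which is itself a useful availability signal."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times)

# print(probe("vpn-us-east.example.com"))  # hypothetical endpoint
```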

By adding VPN capacity, steering traffic at the DNS layer, securing the endpoints and consistently monitoring performance, employers can deliver the same seamless network and technology experiences that employees expect when they are in the office. In a time of uncertainty and worry, this can help reduce the stress of working remotely while also creating a resilient network.

If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.

The post How to Make Sure Your VPN Access Remains Seamless appeared first on eWEEK.

]]>
https://www.eweek.com/networking/how-to-make-sure-your-vpn-access-remains-seamless/feed/ 0