July 05, 2011
The effects of deploying virtualization technologies reach far beyond the parts of the infrastructure being virtualized, because virtualization alters the roles of traditional infrastructure components. Virtualizing servers and desktops changes not only how organizations provide IT services, but also how the performance of those services is managed across the entire delivery chain. Sixty-two percent of organizations that participated in TRAC's recent survey reported that lack of visibility across the IT service delivery chain is one of the key challenges of managing IT performance in virtualized environments. As a result, organizations are increasingly realizing that ensuring optimal network performance becomes even more important once they have virtualized their infrastructure. Furthermore, managing network performance in virtualized environments requires network management systems (NMS) to be enhanced with a new set of capabilities so they can address the key challenges of virtualization management.
This report from TRAC Research analyzes the impact of virtualization technologies on network management and examines capabilities that are becoming increasingly important in virtualized environments.
Click here to download the report
Written by Bojan Simic
November 29, 2010
Industry analysts tend to classify vendors into technology "buckets" and create "labels" for each of them, as that makes it easier to compare products, capture key trends and provide context around the problems these products address. This method also resonates with some technology marketers, as it allows them to benefit from the promotion that other vendors and the media generate around the label of the technology bucket their product has been placed in.
The term "application performance management" (APM) has been one of the hottest technology "labels" over the last few years. Performance of enterprise applications impacts nearly all of the key business goals, and it shouldn't come as a surprise that technology solutions for managing performance of these applications has been very high on IT agendas. With that said, it should be even less of a surprise that technology vendors, who are involved in managing the delivery of applications to business users in any way, realized this opportunity and started calling themselves "APM vendors". However, every "hot" industry term has an expiration date attached to it and sometimes it doesn't take long for a company to go from being one of the biggest promoters of an industry term to getting to the point where it doesn't even want to be associated with it.
Back in 2008, more than 50 technology vendors used the term APM to position their products; that number is now down to fewer than 30. So, have these 20+ companies gone out of business or completely changed their product portfolios? No, but they have realized that the term APM has become diluted and that it is in their best interest to separate themselves from technologies that address the same problem they do, only from a different perspective.
Being thrown into the same technology bucket with companies that address a similar problem from a different perspective can be a major challenge for many technology companies. Vendors in this position typically have two options: 1) wait until the market matures to the point where it becomes obvious that their solution is significantly different from other products in the same "bucket", or 2) coin a new term to describe the category their product belongs to, promote the heck out of it and hope that it becomes an industry-accepted term. It took a combination of these two approaches to somewhat change the boundaries of the APM "bucket". The result is greater market awareness of the differences between two groups of products that also address the challenges of managing application performance, but from different perspectives: end-user experience monitoring and business transaction management (BTM).
The increased interest of end-user organizations in visibility into how their applications are performing, not only from the perspective of their IT departments but from the perspective of business users, has raised market awareness of the role that end-user experience monitoring solutions play in managing application performance. The market has matured enough to recognize that different flavors of technologies for monitoring the quality of end-user experience, such as those provided by Aternity, Knoa Software, Coradiant or AlertSite, do not compete with, but rather complement, vendors such as OPNET, OpTier or Quest's Foglight.
On the other hand, vendors that specialize in managing application performance from a business transaction perspective also found a way to raise awareness of the differences between their solutions and many other APM products, which has led to increased adoption of the term BTM to describe the capabilities of these solutions. These solutions take a different approach to application performance issues than some other APM vendors and enable organizations to monitor the performance of applications across an entire transaction flow. Some of the vendors in this group include OpTier, Nastel, INETCO, Correlsense, Precise Software, dynaTrace and AmberPoint (acquired by Oracle).
Read more...
Written by Bojan Simic
September 15, 2010
Today, Keynote Systems and dynaTrace announced a strategic partnership that will allow end-users to leverage their two solutions in an integrated fashion. This is the second strategic partnership that Keynote has created in this space over the last two months, having announced a similar relationship with OPNET Technologies on July 22. These partnerships might be confusing to some, as it might seem that all three companies are essentially doing the same thing: monitoring the performance of business-critical applications. However, while Keynote specializes in monitoring the quality of end-user experience and in performance testing for Web applications from outside the corporate firewall, OPNET and dynaTrace provide solutions for monitoring application performance across enterprise infrastructure inside the firewall.
The general perception of end-user experience monitoring solutions, such as the one Keynote provides, is that they are very effective at identifying when business users are experiencing problems with application performance, but less effective at drilling down into parts of the application delivery chain to isolate and resolve the root cause of the problem. On the other hand, tools for monitoring the performance of internal infrastructure, such as those from OPNET or dynaTrace, can follow the transaction flow of applications across the network and into the data center, and provide a deeper view into how applications are performing, what is causing performance problems and how those problems can be prevented and resolved. TRAC’s recent report “10 Things to Consider When Evaluating End-User Monitoring Solutions” revealed that the ability to integrate tools for monitoring the quality of end-user experience with tools for monitoring enterprise infrastructure is one of the key aspects of gaining full visibility into application performance. With that said, there is clear value for end-user organizations when products with robust application performance management capabilities (such as OPNET and dynaTrace) are integrated with one of the leading solutions for end-user experience monitoring from outside the firewall (Keynote).
However, in order to evaluate the true significance of these partnerships, they should be analyzed in the context of some of the key dynamics in this market.
Read more...
Written by Bojan Simic
May 25, 2010
One of the emerging trends in IT performance management is that the proliferation of SaaS and cloud computing technologies is changing how organizations use and manage IT services. These trends add a new dimension to service level and performance monitoring, and organizations increasingly expect the same level of flexibility from their management tools that they get from their SaaS and cloud deployments. This also opens up new opportunities for management vendors to differentiate themselves from the competition and increase their presence in new markets by acquiring technologies that are well positioned to address new management challenges.
Our recent article highlighted two technology companies that are likely acquisition targets based on their technology, their alignment with key market trends and the ability of their solutions to fill technology and go-to-market gaps that larger vendors currently have. In part two of this series, we cover two additional companies that meet the same criteria.
Again, this listing is not based on any inside information.
Read more...
May 02, 2010
TRAC Research recently recorded a podcast about the key trends in the load testing market with Priya Kothari, Product Marketing Manager for HP’s Performance Validation solutions. Some of the topics covered in this podcast include: using load testing solutions to align IT with business goals, internal processes that organizations need to have in place to get the most out of their load testing solutions and the role that load testing technology plays in deploying cloud computing services.
Here are some of the key insights from the podcast:
“Finding the tools to actually do your testing is the easy part, but implementing proper load testing practices is often the hard part… In order to do a performance test you’ll need to know what the application is built for. Meaning, what is the business purpose of the application, what will users be doing on the application, what types of transactions will be performed and how many users will be accessing that application? You also need to know what users will be expecting from the application and what types of service level objectives should be in place and tested for. It is also important to understand what pieces of the application are the most critical. If they have this type of information, testers can then accurately plan the testing to ensure that high priority requirements are always covered… This really helps to put an end to testing for testing’s sake and instead aligns IT testing teams with the needs of various business stakeholders.”
“A lot of people believe that moving to the cloud means they don’t have to worry about performance testing, since they now have access to unlimited hardware. What they often don’t realize is that if the application itself is not scalable, the elasticity of the cloud can actually cost them thousands of dollars: if the application is not scalable, you will keep adding new machines to support the load. When moving to the cloud, it becomes even more important to test your applications and to tune them properly so they are optimized when it comes to hardware consumption… With the hybrid cloud, there is a new factor that organizations need to consider: you need to ensure that you have enough bandwidth between yourself and the cloud provider. Also, cloud vendors themselves need to start thinking about load testing. They need to test their infrastructure for specific usage conditions and ensure that they are not going to be a bottleneck.”
“Application modernization is one of the key trends that we are seeing. More customers are moving away from legacy technologies and toward rich internet applications, Web 2.0 and SOA-based applications, as well as frameworks such as AJAX, Flex and Silverlight. We are also seeing some of the major application providers themselves picking up on these trends. For example, SAP is starting to use Flash, Flex and Silverlight in their latest releases. We are also seeing a browser explosion. It used to be all Internet Explorer. Now we are seeing more of Firefox, Google Chrome, Safari and Opera. Everyone is looking to create a richer end-user experience with their applications in order to become more competitive in the marketplace.”
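To make the first point above concrete, here is a minimal sketch of an SLO-driven load test written in Python. It is purely illustrative and is not part of the podcast or of HP's tooling: the URLs, transaction names, user counts and response-time thresholds are all assumptions standing in for the business requirements a testing team would gather up front.

```python
# Minimal SLO-driven load-test sketch (illustrative only).
# All URLs, transaction names, user counts and thresholds are assumptions.
# Error handling is omitted for brevity.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical high-priority transactions and their SLOs (95th percentile, seconds)
TRANSACTIONS = {
    "login":    ("https://example.com/login", 0.5),
    "search":   ("https://example.com/search?q=widgets", 1.0),
    "checkout": ("https://example.com/checkout", 2.0),
}
CONCURRENT_USERS = 25    # simulated virtual users per transaction
REQUESTS_PER_USER = 10   # requests issued per virtual user

def time_request(url: str) -> float:
    """Issue one GET request and return its elapsed time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def run_transaction(name: str, url: str, slo: float) -> None:
    """Drive one transaction under concurrent load and check its SLO."""
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        futures = [pool.submit(time_request, url)
                   for _ in range(CONCURRENT_USERS * REQUESTS_PER_USER)]
        timings = [f.result() for f in futures]
    p95 = statistics.quantiles(timings, n=20)[18]  # 95th percentile
    status = "PASS" if p95 <= slo else "FAIL"
    print(f"{name}: p95={p95:.3f}s (SLO {slo:.1f}s) -> {status}")

if __name__ == "__main__":
    for name, (url, slo) in TRANSACTIONS.items():
        run_transaction(name, url, slo)
```

A dedicated load testing product would add think times, ramp-up schedules, protocol-level scripting and far richer reporting; the point of the sketch is simply that the pass/fail criteria come from business-defined service level objectives rather than from ad hoc test runs.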
Click here to listen to the podcast
March 15, 2010
On March 10th, 2010, CA announced that it had agreed to acquire Nimsoft, an IT service management company, for approximately $350 million. The acquisition is expected to close by the end of March 2010, and Nimsoft will operate as a separate business unit. This report examines the impact of the acquisition on the IT service delivery market and what it could mean for customers and prospects of CA and Nimsoft.
Click here to download a complimentary copy of the report
Written by Bojan Simic
January 26, 2010
Once in a while, IT management vendors pick up a theme that their customers are very interested in; they start building their marketing messaging around it, write white papers about it, and feature it all over their websites. Before you know it, what was originally a legitimate request from end-user organizations to address real challenges becomes a marketing term that is very difficult for end-users to define. “Aligning IT with business” is becoming a very good example of that.
The fact is, the majority of end-user organizations are still struggling to come up with a set of metrics that would help them understand how their IT initiatives are contributing to their business goals. These organizations are allocating a significant part of their enterprise budgets to their IT initiatives and they need to figure out:
- How their past investments in IT are contributing to their bottom line
- What criteria they should be using when evaluating the value of new technology investments
- How to prioritize their current IT management initiatives
So the need to align IT with the business is a true pain point for end-user organizations, and they are willing to invest in technology that will help them with it. But what technology is the best fit?
Read more...