
Availability Manager Functions & AM Metrics

Availability Managers ensure that the services provided through the service catalog meet their agreed levels and are delivered to the customer's utmost satisfaction. The primary responsibilities of the Availability Manager include:

  1. Ensuring that all service deliveries fulfill and match the established SLAs
  2. Ensuring that the agreed levels of service availability are categorized, and that escalations and resolutions are provided on time
  3. Validating that final designs meet the minimum required levels of availability
  4. Assisting in the investigation of all incidents and problems that cause availability issues
  5. Participating in IT infrastructure design decisions on the hardware and software needed to meet availability commitments
  6. Finally, establishing and using availability metrics to measure the availability, reliability, and maintainability of services

The last point, in my view, makes up a major percentage of the Availability Manager's job, so it is worth listing the metrics an AM most commonly uses. To measure the availability of a service, subtract the amount of downtime from the Agreed Service Time (AST), divide the result by the AST, and then multiply by 100 to obtain a percentage.
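
As a quick illustration, here is a minimal Python sketch of that calculation; the function name and the figures of 720 hours of AST and 4 hours of downtime are hypothetical, not from the post.

```python
# Availability % = (AST - downtime) / AST * 100

def availability_percent(agreed_service_time_hours: float, downtime_hours: float) -> float:
    """Availability as a percentage of Agreed Service Time (AST)."""
    return (agreed_service_time_hours - downtime_hours) / agreed_service_time_hours * 100

# Example: 720 hours of AST in a month with 4 hours of downtime -> 99.44%
print(round(availability_percent(720, 4), 2))
```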

The reliability of a service can be measured as either MTBSI (mean time between service incidents) or MTBF (mean time between failures). To calculate a service's reliability as MTBSI, divide the available time in hours by the number of breaks in service availability.
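
The same hypothetical numbers can be plugged into a short sketch. The MTBF definition here (uptime divided by the number of breaks) follows the usual ITIL formula and is an assumption on my part, since the paragraph above only spells out MTBSI.

```python
# MTBSI = available time in hours / number of service breaks
# MTBF  = (available time - downtime) / number of service breaks  (assumed ITIL definition)

def mtbsi(available_time_hours: float, number_of_breaks: int) -> float:
    """Mean Time Between Service Incidents, in hours."""
    return available_time_hours / number_of_breaks

def mtbf(available_time_hours: float, downtime_hours: float, number_of_breaks: int) -> float:
    """Mean Time Between Failures, in hours (counts uptime only)."""
    return (available_time_hours - downtime_hours) / number_of_breaks

# Example: 720 available hours, 4 hours of downtime, 2 service breaks
print(mtbsi(720, 2))    # 360.0 hours between incidents
print(mtbf(720, 4, 2))  # 358.0 hours between failures
```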

The maintainability of a service is measured as MTRS, the mean time to restore service. You calculate MTRS by dividing the total downtime in hours by the number of service breaks.
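
A matching sketch for maintainability, again with hypothetical figures:

```python
# MTRS = total downtime in hours / number of service breaks

def mtrs(total_downtime_hours: float, number_of_breaks: int) -> float:
    """Mean Time to Restore Service, in hours."""
    return total_downtime_hours / number_of_breaks

# Example: 4 hours of total downtime across 2 breaks -> 2.0 hours per restore
print(mtrs(4, 2))
```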

On that note, here's to delivering the best service.

Sam Kurien

