Wednesday, February 22, 2012

High Availability Basic Concepts-I

For any software application or service, high availability refers to the availability of that application or service to its users without failure. As the simplest example, Google provides its search capabilities to (virtually) all Internet users, 24x7, via its search engine. We assume that as soon as we switch on our PC or laptop (or any other compatible device) and connect to the Internet, Google search will be available to us. The word virtually compensates for those small time periods when the Google search engine is not available to users due to server maintenance or other reasons. This duration is called downtime, and is usually measured over a year. So if an application or service provider claims 99.9% availability, it means that over a year its services may be down for 0.1% of the year, i.e. roughly 8 hours and 45 minutes.

The primary goal of any high availability solution is to minimize the impact of downtime. And the Service Level Agreements for any such high availability solution (or service) always cover these figures in their terms and conditions. The availability of a solution (an application or service, or a group of them) can be expressed with this calculation:

Availability = ( Actual Uptime / Expected Uptime ) x 100

The resulting value is often expressed by the industry in terms of the number of 9's that the solution provides, which conveys the annual number of minutes of expected uptime or, conversely, of allowed downtime.

Number of 9's    Availability Percentage    Total Annual Downtime
2                99%                        3 days, 15 hours
3                99.9%                      8 hours, 45 minutes
4                99.99%                     52 minutes, 34 seconds
5                99.999%                    5 minutes, 15 seconds
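As a quick sanity check on the figures above, here is a small Python sketch of the same calculation (a 365-day year is assumed; the numbers shift slightly for leap years):

```python
# Downtime allowed per year for a given availability percentage,
# assuming a 365-day year (525,600 minutes).

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def annual_downtime_minutes(availability_pct):
    """Maximum yearly downtime (in minutes) for a given availability
    percentage, e.g. 99.9 for 'three nines'."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100.0)

def availability(actual_uptime, expected_uptime):
    """Availability = (Actual Uptime / Expected Uptime) x 100."""
    return (actual_uptime / expected_uptime) * 100.0

for pct in (99.0, 99.9, 99.99, 99.999):
    m = annual_downtime_minutes(pct)
    print("%g%%: about %d minutes of downtime per year" % (pct, round(m)))
```

Running this reproduces the table: three nines allows roughly 526 minutes (8 hours 45 minutes) of downtime per year, five nines only about 5 minutes.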

More details on various high availability applications and services will be covered very soon (as soon as I have some free time for the same :)

Alright, got some time to generate some content: a blog on Microsoft's High Availability Solutions.

Wednesday, February 15, 2012

Journey of Crystal Reports

For one of my recent assignments, I had to dig out the entire history of the world-famous reporting product "Crystal Reports": how it originated, transformed, and evolved into what we see today as SAP Crystal Reports 2011. Here is a brief summary of the same:

Crystal Reports is a business intelligence application used to design and generate reports from a wide range of data sources. The company behind it started as Crystal Services Inc. in 1989, developing the product as a commercial report-writing tool for its accounting software. It released three initial versions of the product: Quik Reports (1990), Quik Reports 2.0 (1991) and Quik Reports 3.0 (1992).

Then, after its acquisition by Seagate Technology Inc. in 1994, the company was renamed Seagate Software. The product was also renamed, and launched as Seagate Info 4.0 in 1995. In 1995, Seagate Software decided to collaborate with Holistic Systems (also acquired by Seagate Technology Inc.), forming the Information Management Group of Seagate Software. Under this collaboration, the product was rebranded, and users enjoyed five versions: Crystal Reports 4.5 (1996), Crystal Info 5 (1997), Seagate Crystal Info 6 (1998), Seagate Info 7 (1999), and Seagate Info 7.5 (2000). (And yes, the name changed after almost every release; they just could not find the right name for it!)

In 2001, Seagate Software was renamed again, this time as Crystal Decisions. It then released Crystal Enterprise 8.0 and Crystal Enterprise 8.5 (2001), followed by Crystal Enterprise 9.0 (2001), in quick succession.

The company was then acquired by the famous Business Intelligence solution provider BusinessObjects in 2003. The first version released after this acquisition, Crystal Enterprise 10.0 (2004), carried the same naming format as the earlier trend. The product was then released under revised names as Crystal Reports XI R1 (2005) and Crystal Reports XI R2 (2006).

With the acquisition of BusinessObjects by SAP in 2007, the product again witnessed a change of name: it was released as Crystal Reports 2008 (2008).

The latest release is named SAP Crystal Reports 2011.

For details on the future roadmap of this product, and what users can expect from SAP, here is SAP Crystal Reports 2011 and the 20-year roadmap.

Friday, February 10, 2012

Business Intelligence Technology Stack


Business Intelligence generally refers to the identification, extraction and transformation of business data into useful information (reports, charts, graphs, etc.) to gain business-specific insights like demand forecasts and sales predictions, thus enabling better decision making. It usually refers to computer-based techniques like reporting, analytics, data mining, benchmarking and predictive analysis, but is not limited to them.


As explained by D. J. Power in his work "A Brief History of Decision Support Systems", there are various tools and technologies that provide Business Intelligence capabilities and make for an efficient Decision Support System (DSS). His research covers even basic systems like file drawers, which were used (by small organizations) to keep information organized and readily searchable. But in the present information age, those kinds of systems seem outdated for the requirements of a global organization with hundreds of branches across the world, generating and processing huge amounts of information every hour. In this article, we focus only on computer-based programs and applications that consume and process the digital information available on an organization's servers and generate meaningful results from it, enabling better decisions from BDMs (Business Decision Makers), TDMs (Technology Decision Makers) and other IT pros involved in decision making.

The well-known enterprise analyst organization Gartner predicts five-fold growth in open-source BI tool deployments by the end of 2012. Gartner also mentioned, in its report Magic Quadrant for Business Intelligence Platforms 2011, that growth in BI will be driven by factors like the consumerization of BI and support for extreme data performance with emerging data sources (known as Big Data). And with some recent breakthrough innovations by the major BI vendors (like SAP's HANA appliance, Oracle's Exalytics appliance, and Microsoft's BISM model), the IT world may expect more surprises from the major BI vendors (including but not limited to Microsoft, Oracle, MicroStrategy, IBM, Information Builders, QlikTech, SAP and SAS).

But irrespective of vendor, all BI solutions share a generic technology stack with the following layers:

· User Interface: This includes the web-based or application-based front end that brings the analysis to the users: portals (in the case of networked or web-based analytics) or the application front end (in the case of a locally deployed BI solution).

· Development and Admin Tools: This comprises the tools, languages and processes involved in the development and management of BI applications and systems. (The difference between BI systems and BI solutions will be covered in another blog.) For example, BI development languages include MultiDimensional eXpressions (MDX), XML for Analysis (XMLA) and Data Mining Extensions (DMX).

· BI Tools: This comprises the tools (reports, dashboards or otherwise) that enable users to perform the desired analysis on the underlying data; users access these tools via the User Interface layer discussed above. Microsoft's PowerPivot and Power View, SAP's Crystal Reports, Jaspersoft, and Oracle's Business Intelligence Foundation Suite are just a few examples; there are more than a hundred readily usable vendor products available in the market.

· Applications and BI data sources: This comprises the various sources that keep information in a pre-processed form that can be readily consumed for analysis. It includes models like Online Analytical Processing (OLAP) cubes and Decision Support Systems, and concepts like data mining, analysis services, etc.

· Data Integration Tools: This comprises the various data management tools and concepts like Master Data Management (covering data collection, source identification, schema mapping, normalization, data transformation, rule administration, error detection and correction, data consolidation, data storage, data distribution, data classification, item master creation, data enrichment and data governance) and services like taxonomy services and Data Quality Services.

· Data warehouse platform: This comprises the various data sources, including simple text-based files, Excel sheets, relational databases, and even complex unstructured data types like audio files, videos, web logs, click streams, geo-spatial data, etc.
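As a toy illustration of how these layers hand data to one another, here is a minimal Python sketch. All names, records and numbers here are invented for illustration; real stacks use dedicated products at each layer (ETL suites, OLAP engines, reporting tools):

```python
# Data warehouse layer: raw records from heterogeneous sources.
raw_sales = [
    {"region": "EMEA", "amount": "1200.50"},
    {"region": "emea", "amount": "800.00"},
    {"region": "APAC", "amount": "950.25"},
]

# Data integration layer: normalize and cleanse the records
# (a tiny stand-in for MDM / data-quality services).
def integrate(records):
    return [{"region": r["region"].upper(), "amount": float(r["amount"])}
            for r in records]

# BI data source layer: pre-aggregate into a cube-like structure
# keyed by dimension (here, a single 'region' dimension).
def build_cube(records):
    cube = {}
    for r in records:
        cube[r["region"]] = cube.get(r["region"], 0.0) + r["amount"]
    return cube

# BI tool layer: a "report" over the cube, which the UI layer
# would then render for the user.
def sales_report(cube):
    return sorted(cube.items(), key=lambda kv: -kv[1])

cube = build_cube(integrate(raw_sales))
for region, total in sales_report(cube):
    print(region, total)
```

Each function stands in for a whole layer of the stack; the point is only the direction of data flow, from raw sources up to the user-facing report.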

Wednesday, January 25, 2012

Brief history of Microsoft SQL Server

Just got curious about how SQL Server has evolved since its birth. Here is a short blog reflecting that research.

For an interesting story on how SQL Server evolved, please refer to the document History of SQL Server.

A brief history of SQL Server is available in the table below (along with the relevant links to corresponding resources):

Year    SQL Server Version                                                        Code Name
2012    SQL Server 2012                                                           Denali
2010    SQL Server 2008 R2                                                        Kilimanjaro (aka KJ)
2010    SQL Azure                                                                 Matrix (aka CloudDB)
2008    SQL Server 2008                                                           Katmai
2005    SQL Server Integration Services (formerly Data Transformation Services)
2005    SQL Server 2005                                                           Yukon
2004    SQL Server 2000 Reporting Services
2003    SQL Server 2000 (64-bit Edition)                                          Liberty
2000    SQL Server 2000 Analysis Services                                         Shiloh
2000    SQL Server 2000
1999    SQL Server 7.0 OLAP Services (including Data Transformation Services)     Plato
1998    SQL Server 7.0                                                            Sphinx
1996    SQL Server 6.5                                                            Hydra
1995    SQL Server 6.0                                                            SQL95
1993    SQL Server 4.2 (32-bit Edition)                                           SQLNT
1991    SQL Server 1.1
1989    SQL Server 1.0 (16-bit)                                                   Ashton-Tate/Microsoft SQL Server

Also, details of some of the specific release dates and build numbers are available on this MSDN link.

Some guidelines on upgrade paths (up to SQL Server 2008 R2) are available on MSDN here.

More technical details about each version are available here.

Tuesday, January 3, 2012

How much does a private cloud cost?


Microsoft recently showcased the power of its private cloud at a TechEd event in North America, by building one onsite and running Hands-on Labs on it. It included hardware, mostly from Microsoft's partner HP, and the Microsoft software stack. The video for the same is available here.
To get a better understanding of this private cloud (in terms of cost), we did a simple cost analysis of the hardware infrastructure used for building such a datacenter. This is a simple blade-system-based datacenter, capable of serving around 1,500 client systems. The datacenter hosts a huge number of virtual machines (VMs), which are in fact the pre-configured environments for Hands-on Labs (HOLs) for learning different technologies. These HOLs can be accessed by each of the client systems as and when required. Upon request, a copy of the VM is assigned to the client machine; it becomes the personal copy for that client, but still runs on the powerful servers of the datacenter. No actual processing is done at the client's end, beyond a simple Internet Explorer based application sending requests to, and receiving responses from, the datacenter.
Following are rough figures describing the hardware infrastructure of the private cloud, along with a cost estimate.
Sample Private Cloud Datacenter configuration and cost:
Blade Enclosure Unit
  HP BladeSystem c7000 Enclosure with Flex-Fabric ................ $24,399.00
  HP BladeSystem Onboard Administrator ........................... $899.00
    - Enclosure management
    - Flex-Fabric management
    - HP Integrated Lights-Out (iLO)

Blade Servers
  HP ProLiant BL460c G7 Server Blade
  (model considered: HP BL460c G7 L5640 1P Svr (603256-B21))
    - 16 server blades
    - 2P x 6 cores
    - 128 GB RAM per blade
    - 2 x 146 GB HDD
  $333,180 per blade x 16 = $5,330,880

  HP IO Accelerator Card in each blade (320 GB)
    - Solid-state disk PCI card
    - 145,000 IOPS
    - Read: 750 MB/s; Write: 550 MB/s
    - Sizes (GB): 320 or 640
  $9,719 per blade x 16 = $155,504

SAN Disks
  HP StorageWorks 4400 EVA Fibre Channel SAN
    - 4/8 Gbps Fibre Channel
    - Dual controllers
    - Dual embedded SAN switches
  $13,839.20 (including 8 HDDs)

  Additional HDDs: 40 x 300 GB 15K RPM disks
  $1,491 x 40 = $59,640

Network
  HP FlexFabric Network
    - Converged Infrastructure
    - Virtual Connect technology
    - 4/8 Gbps Fibre Channel
    - 1/10 Gbps Ethernet
  Price varies depending on infrastructure location and size
The total infrastructure cost for this sample private cloud datacenter comes out to ~$5,585,161. The costs of software, installation, maintenance and upgrades would be added on top of this to get the complete TCO for a private cloud.
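The arithmetic behind that total can be reproduced with a small Python sketch (the component names are shortened labels for the line items above, and the prices are the indicative figures quoted there; the network component is excluded since its price varies):

```python
# Cost roll-up for the sample private cloud configuration above.
components = {
    "Blade enclosure (HP c7000 w/ Flex-Fabric)": 24399.00,
    "Onboard Administrator":                     899.00,
    "Blade servers (16 x $333,180)":             16 * 333180.00,
    "IO Accelerator cards (16 x $9,719)":        16 * 9719.00,
    "SAN (EVA 4400, incl. 8 HDDs)":              13839.20,
    "Additional HDDs (40 x $1,491)":             40 * 1491.00,
}

total = sum(components.values())
print("Total infrastructure cost: ~$%s" % format(round(total), ","))
# -> Total infrastructure cost: ~$5,585,161
```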
This private cloud provides many benefits like scalability, self-service capabilities, high customization, security, and reduced operational costs. So for enterprises investing billions of dollars in their infrastructure, this seems a reasonable price for all the benefits they gain; but for medium and small businesses, it seems an investment into an unforeseen future.
(These are just indicative prices taken from the mentioned sources. In case of any discrepancies in prices, products, or calculations anywhere, we are open to corrections. Do let us know your feedback.)

Friday, December 30, 2011

Top 10 cloud computing providers of 2011

As expected at the beginning of 2011, the world saw many transitions, transformations, and turnarounds in cloud technologies from various vendors across the globe. A summarized report of the predictions made by various analysts is available here.

And the list of the most substantial players, as already mentioned in an earlier post, comes out as follows:

1) Amazon
For the second year in a row, the king of cloud is still Amazon Web Services. No other company has come close to the cloud-based innovation AWS provides.
Even Eli Lilly taking some of its business elsewhere ended up doing AWS a favor. Since that debacle over SLAs, Amazon has stepped up its support and now offers a premium "white glove" service that routes your call to the nearest engineering specialist.

2) Verizon/Terremark
Charging into the number two position on our list is Verizon. The telco giant had previously built its own cloud; high-quality stuff but with a commensurate price. The Four Seasons of cloud, if you will: snooty service, small menu, long waits for a reservation and eye-watering bill. It was a test run, and apparently Verizon decided it needed some expertise instead of re-inventing the wheel.
Verizon then bought Terremark, much as you or I would buy a coffee and a bagel. Not only is Terremark one of the premier Tier 1 hosters in the world, it's also a cloud supplier to the coveted enterprise market, effectively moving Verizon into the top ranks.
Bigger than almost any competitor and with all the pipe in the world (literally), Verizon could be the King Kong of cloud. We’ll see if they can make it work or if Terremark Cloud is doomed for post-acquisition mishandling.

3) IBM
New to the list is IBM with its Smart Business Test and Development Cloud. While Big Blue might be lugging a hundred years of IT baggage, it has finally launched Infrastructure as a Service, although initially just for test and development purposes.
Despite its convoluted, muddled strategy in the cloud market, the Test and Dev service is winning enterprise business, which after all is IBM’s meat and potatoes. IBM reportedly earned $30 million in cloud revenue last year; few others have the scale of the enterprise user base to ramp up that fast.

4) Salesforce.com
Salesforce.com maintains a spot in the top five, thanks to its acquisition of Heroku. The company singlehandedly forced its way into the Platform as a Service market with this buy, and it will give them legs to hit customers not interested in the patented "Salesforce.com maximum lock-in" feature offered on Force.com and their CRM platform.
The Software as a Service market is rapidly coalescing into a mature, well-defined space; props to Salesforce.com for grabbing on to something that keeps it relevant going forward.

5) CSC
CSC, the IT integrator and service provider has cooked up an interesting private cloud service called BizCloud. The company will wheel VCE -- the giant cloud-in-a-box system from VMware, Cisco and EMC -- into your IT shop. Ten weeks later, it will be integrated into all your messy, legacy IT systems, turning on Infrastructure as a Service. CSC then manages your hardware; for extra capacity, you can hook into a public cloud service, also running on VCE.
CSC points to the trend of enterprises looking for practical ways to use (and get to) cloud computing; the company also performs massive-scale integrations with Google Apps and other Software as a Service players. As a bridge to the cloud for many enterprises, CSC is on the front lines and on our top 10 list.

6) Rackspace
Even though Rackspace fell in the ranks from last year's list, it's still the number two cloud provider after Amazon in terms of revenue. It might even be coming close in terms of its user base, a remarkable feat.
But aside from the feel-good soft launch of OpenStack last year, it's still business as usual. The company hasn't made any major renovations to the service, something that may change as Rackspace absorbs cloud management technology firm Cloudkick.

7) Google
Since our initial list, Google App Engine has won lots of business among Web, gaming and mobile companies, but similarly has yet to make any impact among enterprise developers. We talked to the Google App Engine team recently and they are working on adding features, including an SLA and a hosted SQL service that Google hopes will attract the enterprise developer audience.
The company is also reportedly hiring 6,000 warm bodies in 2011, most likely to supply that crucial enterprise support Google has so notably lacked. Can’t win the cloud with foosball and beanbags, kids; put your big-boy clothes on and get ready for real customers. The race is on with Microsoft!

8) BlueLock
BlueLock is a small-scale provider that's been a key testbed for VMware’s vCloud Express. It even pioneered a tool to help customers get out of their ESX bubbles and mix in vCloud resources, something VMware hadn’t been able to do. BlueLock’s Indiana facilities should soon become a major local employer, as it's now a key VMware/VCE provider and likely to see continued growth.

9) Microsoft
Microsoft is on a bit of a downward spiral. While the software giant has made a song and dance about its Azure cloud service, claiming 31,000 companies are customers, we’ve yet to see any significant traction among enterprise IT developers. Web companies, mobile companies, tech and social networking firms use it, sure, but so far there’s no standout among traditional enterprises.
Meanwhile, Microsoft’s cloud business is in turmoil, as Steve Ballmer has purged many of the company's key leaders. Software architect Ray Ozzie is out. Bob Muglia and Amitabh Srivastava from Server and Tools (and Azure) have left. Dave Thompson (Office Online, Office 365) is gone.
Now at the helm is veteran Satya Nadella, who ran Microsoft’s unremarkable ERP and CRM efforts. These people didn’t fail -- they built everything Microsoft can legitimately call "cloud" -- but they’ve been cut loose before their work had a chance to prove itself.

10) Joyent
We bumped a few other providers off the list due to lack of activity (business as usual doesn’t cut it). Joyent, however, kept a spot in the top 10 by releasing its platform software and forming a partnership with Dell to sell pre-configured cloud infrastructure packages.
It's a nice way to push the model -- use the Joyent service, or build your own if you like the technology but not the public option. This may be the direction the market is headed, as more and more businesses want to adopt cloud computing within their infrastructure.

(Source: http://searchcloudcomputing.techtarget.com/feature/Amazon-2011-top-cloud-computing-provider#slideshow)

Thursday, December 22, 2011

IT Skills vs Real Education

Recently, an article was published on one of the technology portals about the "Top 11 Skills of 2011". On the basis of the questionnaires they prepared and the answers they received, they compiled a list of the top skills for 2011. But the results, in my opinion, were a bit misleading.

These are mostly just the results of programming contests, typically attended by college students or IT-company employees on the bench. For them, these are the most familiar, simplest, and most comfortable languages. In practice, different lists of hot technologies can be found at sites like: http://www.sap-img.com/the-top-it-skills-to-have.htm

But in reality, IT skills comprise a rather different set of names altogether. It is much beyond just a few programming languages: problem solving, training skills, foreign-language communication, etc., along with domain knowledge like networking, security, telecommunications, Business Intelligence, etc. There are a few sites where I found mention of such skills, like people daily and itcareerfinder, but there are very few paths or search algorithms that bring these skills higher in the list of essential IT skills.

And if you are at a beginner level, you first need to get the basic skills like teamwork, communication and cooperation; I am glad there is a small number of links promoting that too.

If you have these, you can learn any language (C, Java, etc.) within a few weeks.

Again repeating a nice thought from the great Chanakya:
"Education is the Best Friend.
An Educated Person is Respected Everywhere.
Education beats the Beauty and the Youth."

But the thing to be noticed here is the real meaning of education.
It is not just having a Bachelor's or Master's degree; it is something more than that. And it cannot be taught in books, but is learned through real-life experiences.
