Blog


CCS Global Tech is where ideas, innovation, and positive energy combine to kick-start your career and take you to new professional heights.
Database Technology for Large Scale Data
07 November 2016
In the past 5 years many specialized DBMSs have been introduced, as well as database systems with new storage schemes. These can be categorized by the underlying technology. In this blog, we will explain the four most popular types of database systems currently used by top corporations such as Yahoo, Facebook, Google, Microsoft, and Amazon.
QLIKVIEW
07 October 2016
QlikView is a leading Business Discovery Platform. It is very powerful for visually analyzing the relationships between data. It performs in-memory data processing and stores the data in the report it creates. It can read data from numerous sources, including files and relational databases. Businesses use it to gain deeper insight by running advanced analytics on the data they have. It even handles data integration, combining data from various sources into a single QlikView analysis document.
Team Foundation Server (TFS)
30 September 2016
Team Foundation Server (commonly abbreviated to TFS) is a Microsoft product that provides source code management (either with Team Foundation Version Control or Git), reporting, requirements management, project management (for both agile software development and waterfall teams), automated builds, lab management, testing and release management capabilities. It covers the entire application lifecycle, and enables DevOps capabilities. TFS can be used as a back-end to numerous integrated development environments (IDEs) but is tailored for Microsoft Visual Studio and Eclipse on all platforms.
Tableau Software
23 September 2016
Tableau Software is an American computer software company headquartered in Seattle, Washington. It produces a family of interactive data visualization products focused on business intelligence. Tableau helps anyone see and understand their data: connect to almost any database, drag and drop to create visualizations, and share them with a click.
BUSINESS INTELLIGENCE DASHBOARD
15 September 2016
A business intelligence dashboard is a data visualization tool that displays the current status of metrics and key performance indicators (KPIs) for an enterprise. Dashboards consolidate and arrange numbers, metrics and sometimes performance scorecards on a single screen.
Business Intelligence vs. Business Analytics: What’s The Difference?
26 August 2016
Business intelligence and business analytics… aren’t they the same thing?
Or are they describing opposite processes?
There are a lot of big words thrown around in the world of BI, and it’s easy to get lost in a whirlwind of interpretation.
In this article we will explore the differences between the two.
What’s new in SQL SERVER 2016
19 August 2016
There is a lot of buzz around SQL Server 2016. Microsoft announced the release of SQL Server 2016 at the Microsoft Ignite Conference during the first week of May 2015, and since then a number of Community Technology Previews (CTPs) have come out. In this article we explore, at a very high level, 10 of the new features.
BIG DATA
5 August 2016
The promise of data-driven decision-making is now being recognized broadly, and there is growing enthusiasm for the notion of "Big Data."
We are awash in a flood of data today. In a broad range of application areas, data is being collected at unprecedented scale. Decisions that previously were based on guesswork, or on painstakingly constructed models of reality, can now be made based on the data itself. Such Big Data analysis now drives nearly every aspect of our modern society, including mobile services, retail, manufacturing, financial services, life sciences, and physical sciences.
Cloud Computing: Alternative sourcing Strategy for Business
28 July 2016
The term "Cloud Computing" has been in use for just under two years in relation to services or infrastructure resources that can be contracted over a network. The idea of renting IT instead of buying it, however, is nothing new, and so Cloud Computing has many antecedents and equally many attempted definitions. The players in the large world of clouds are Software as a Service providers, outsourcing and hosting providers, network and IT infrastructure providers and, above all, the companies whose names are closely linked with the Internet's commercial boom. Taken together, all these services outline the complete package known as Cloud Computing, with the focus depending on the source.
Use Bulk Operations and Minimize Logging
22 July 2016
In the data flow task, use the fast load option to load data into the target tables.
- For fast ETL, foreign keys can be removed so that inserts and updates run in parallel. If truly needed, the best practice is to drop all the foreign keys and re-create them once the data pull is complete. When a large volume of data changes in the target table, dropping primary keys and other indexes is also a best practice.
- Use TABLOCK if the target table is not in frequent use, and disable check constraints.
- Set a proper value for "Maximum insert commit size" to ensure batch-wise inserts.
- Prefer NOLOCK for lookups and other tasks to minimize locking impact.
- SQL Server partitioning and the SWITCH statement can increase performance severalfold when working with large data sets. Minimize blocking and slow-performing SSIS operations, e.g. Sort and aggregations such as SUM, MIN, and MAX.
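The points above can be sketched in T-SQL. This is a minimal illustration, not a complete script: the table, column, and constraint names (dbo.Sales, dbo.SalesStage, FK_Sales_Customer, CK_Sales_Amount) are hypothetical, and minimal logging also depends on the database recovery model.

```sql
-- Drop the foreign key before the load so inserts can run in parallel
ALTER TABLE dbo.Sales DROP CONSTRAINT FK_Sales_Customer;

-- Disable the check constraint for the duration of the load
ALTER TABLE dbo.Sales NOCHECK CONSTRAINT CK_Sales_Amount;

-- Table-locked insert, eligible for minimal logging; NOLOCK on the source read
INSERT INTO dbo.Sales WITH (TABLOCK) (SaleID, CustomerID, Amount)
SELECT SaleID, CustomerID, Amount
FROM dbo.SalesStage WITH (NOLOCK);

-- Re-enable the check constraint, validating existing rows,
-- and re-create the foreign key dropped above
ALTER TABLE dbo.Sales WITH CHECK CHECK CONSTRAINT CK_Sales_Amount;
ALTER TABLE dbo.Sales ADD CONSTRAINT FK_Sales_Customer
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customer (CustomerID);

-- Partition switch: move a fully loaded staging partition into the
-- target as a metadata-only operation
ALTER TABLE dbo.SalesStage SWITCH PARTITION 1 TO dbo.Sales PARTITION 1;
```

The SWITCH at the end is what makes large loads fast: both tables must share the same partition scheme and structure, and the data movement is then a metadata change rather than a row-by-row copy.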
Best Practices While Loading Oracle Data Using SSIS
29th June, 2016
With the changing dynamics of technology, extracting data from heterogeneous databases is an enterprise need. Microsoft SSIS serves as one of the good ETL tools for extracting data from multiple, heterogeneous sources. Here are some considerations and best practices to keep in mind during the pull.
Pulling Data from Oracle
There are several methods to pull data from Oracle. Below are some more points of consideration while pulling data from Oracle.
Choosing a Provider
The choice of provider can change the speed of the data pull severalfold, which makes it one of the most critical parts of planning the ETL. The options currently available are listed below.
Converting a Simple SQL Server Native Client OLE DB Application to ODBC
26th May, 2015
The application flow in ODBC is similar to the application flow in OLE DB. In both cases, the application:
- Initializes the environment.
- Connects to a data source.
- Creates and executes a command.
- Processes results, if any.
- Cleans up.
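As a sketch, that flow on the ODBC side looks roughly like the following C program. The DSN name (MyDSN) and the query are placeholders, error handling is abbreviated to keep the steps visible, and running it requires an ODBC driver manager and a configured data source.

```c
#include <stdio.h>
#include <sql.h>
#include <sqlext.h>

int main(void)
{
    SQLHENV env;
    SQLHDBC dbc;
    SQLHSTMT stmt;

    /* 1. Initialize the environment */
    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);

    /* 2. Connect to a data source (DSN name is a placeholder) */
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
    SQLDriverConnect(dbc, NULL, (SQLCHAR *)"DSN=MyDSN;", SQL_NTS,
                     NULL, 0, NULL, SQL_DRIVER_NOPROMPT);

    /* 3. Create and execute a command */
    SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
    SQLExecDirect(stmt, (SQLCHAR *)"SELECT name FROM sys.tables", SQL_NTS);

    /* 4. Process results, if any */
    SQLCHAR name[256];
    while (SQLFetch(stmt) == SQL_SUCCESS) {
        SQLGetData(stmt, 1, SQL_C_CHAR, name, sizeof(name), NULL);
        printf("%s\n", name);
    }

    /* 5. Clean up */
    SQLFreeHandle(SQL_HANDLE_STMT, stmt);
    SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}
```

Each numbered comment corresponds to one step in the list above; in OLE DB the same stages map to creating a data source object, a session, and a command object instead of the handle hierarchy.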
SQL Server 2008 High Availability Options
27th March, 2015
There are four High Availability options in SQL Server 2008 that we will cover: Failover Clustering, Database Mirroring, Log Shipping, and Replication. Each one has its own features and benefits.
As you may know, the Recovery Time Objective (RTO), which is the maximum tolerable length of time that a system can be unavailable, and the Recovery Point Objective (RPO), which is how much data can be lost, need to be considered to meet the organizational objectives for each application that is critical to the business. Any desired High Availability option should satisfy these objectives.
Mission Critical Performance and Scale with SQL Server and Windows Server
13th March, 2015
Data is growing everywhere, and the amplification of data affects every imaginable device, application, and process as our world rapidly evolves into digital and virtual transactions and experiences. It is no longer the case that organizations serve customers during standard business hours in a single time zone or geography. Today, services are continually available to customers through a range of operational measures, from an Internet presence at the very minimum to tracking complex operations globally for the highest efficiencies and customer satisfaction.
Unit testing SQL Server 2K8 Database project
13th March, 2015
This is all about unit testing a SQL Server 2008 Database project using Visual Studio 2010. Because unit tests exercise individual parts of the program, integration testing becomes easier; moreover, well-written unit tests help when enhancing, maintaining, or extending a solution.
Building a Dashboard in SQL Server Reporting Services
3rd March, 2015
Your data warehouse is rock solid. You have multiple sets of fantastic reports. Departments utilize their own designated reporting areas, and you've increased productivity and value (ROI on the BI program) by focusing operations on critical areas by offering specialized filters and sorting as well as exception reports.
Curing Data-Obesity in OLTP Databases
20th February, 2015
It is quite common to have an OLTP database that must store large amounts of data which pile up into hundreds of millions, even billions, of rows. What do we do in such cases? In this article I will describe a way to deal with constant flows of OLTP data into production systems and how to offload this data, describing the process from beginning to end.
Backup and Restore Strategies in SQL Server
17th February, 2015
There are several high availability solutions that can be used with SQL Server, like AlwaysOn, Fail-over clustering, or Database mirroring. While these high availability solutions ensure maximum uptime for your databases, you need to setup backup and restore strategies to recover the data or minimize the risk of data loss in case a failure happens.
SSRS Textbox Tips and Tricks
9th February, 2015
SQL Server Reporting Services (SSRS) offers several different options for working with textboxes on a report. Some of these options include the following three items:
- Adding multiple textboxes to a single cell on a tablix.
- Adding a chart or graph to a tablix.
- Special formatting options and adding line feeds / carriage returns.
Database as a Service Reference Architecture Guide
January 23, 2015
This guide to building the infrastructure for hosting Microsoft® SQL Server® Database as a Service (DBaaS) is not limited to a particular type of hardware. By using the features of SQL Server 2014 and Hyper-V® virtual machines with Microsoft System Center 2012, a hosting service provider can start with very small tenant databases and scale out or scale up to meet the needs of the largest and busiest SQL Server applications. This reference architecture includes hardware, software, system design, and component configuration.
Comparison between SSRS and Crystal Report (CR)
January 19, 2015
Below are my observations on some of the differences between CR/CE and SSRS. These observations are based on the use of these software packages for ad hoc report design and web delivery of the reports. I don’t do any standalone application development and can’t speak to their capabilities in those areas. This list is by no means complete, and some things are completely a matter of opinion. I hope this can help you in your decision.
What Changed in SQL Server 2014?
December 30, 2014
In this section, we discuss the primary changes made to the CE component for SQL Server 2014. The legacy CE’s default behavior has remained relatively static since SQL Server 7.0. The SQL Server 2014 changes address performance issues for key scenarios based on benchmark and customer workload testing. The new CE design also lays the groundwork for future improvements.
Migrating to AlwaysOn Availability Groups from Prior Deployments Combining Database Mirroring and Log Shipping
December 22, 2014
This migration guide provides guidance for customers who have deployed database mirroring for local high availability and log shipping for disaster recovery based on SQL Server 2008 R2 or earlier and now want to upgrade to use SQL Server 2012 AlwaysOn Availability Groups. The migration sequence presented here is a best-practice approach that preserves the high availability and disaster recovery capabilities of your databases during most of the migration process.
MICROSTRATEGY 9.4 VS. TABLEAU 8.1
December 15, 2014
Single Version of the Truth: MicroStrategy’s single metadata ensures consistent KPI definitions across the entire enterprise.
Multiple Versions of the Truth: Tableau deployments typically have a number of small metadata models (a.k.a. Tableau Data Source). The lack of a centralized metadata model results in creation of multiple disconnected analytical silos and promotes multiple versions of the truth. There is limited/no reusability of analytical definitions across the multiple Tableau metadata models.
SQL Server In-Memory OLTP Internals Overview
December 12, 2014
SQL Server was originally designed at a time when it could be assumed that main memory was very expensive, so data needed to reside on disk except when it was actually needed for processing. This assumption is no longer valid, as memory prices have dropped enormously over the last 30 years.