
SigmaWay Blog

SigmaWay Blog aggregates original and third-party content for site users. It features articles on Process Improvement, Lean Six Sigma, Analytics, Market Intelligence, Training, IT Services and the industries that SigmaWay serves.


Big Data in good governance

Intelligent processing of data is essential for addressing societal challenges. Data could be used to enhance the sustainability of national health care systems and to tackle environmental problems by, for example, processing energy consumption patterns to improve energy efficiency, or processing pollution data to improve traffic management.

To make the best use of big data now and in the future, government must have the right infrastructure in place. One proposal is a dedicated data force, modelled on the successful nudge unit, with access to data from different departments so it can identify where savings could be made. New data science skills will be needed across government, but more important is ensuring that public service leaders are confident in combining big data with sound judgment.

Closely linked to government's drive to make better use of big data is its drive to make data open. Open data, particularly that in the public sector, is often big data – for example, the census, information about healthcare or the weather – and making large government datasets open to the public drives innovation both within government and outside it. See more at: http://fcw.com/Articles/2013/09/25/big-data-transform-government.aspx?Page=1

 


Big Data: Key to better pricing

In the recent past, most companies have recognized the bottom-line impact of effective pricing. Tapping the full promise of pricing requires an infrastructure to drive real and sustained pricing performance. With such a foundation, a company can establish and strengthen pricing activities by creating deliberate decision processes, a specialized pricing organization, mechanisms that appropriately measure and reward pricing excellence, and robust support tools and systems.

A pricing infrastructure can be difficult and costly to create. It requires investing appropriately, empowering the right people, articulating clear targets and goals, and managing risk. Yet the benefits of realizing true pricing excellence are worthwhile: a one-percentage-point improvement in average price of goods and services leads to an 8.7 percent increase in operating profits for the typical Global 1200 company. 

Every company should have a set of pricing metrics that measure the financial and operational health of pricing across the business. These metrics may include simple data, such as the average selling price, discount, and margin for key products; operational data, including the number of pricing exceptions and win/loss percentages; and special measures to track the progress and impact of specific pricing initiatives. While the manager of a single product line may see metrics only for that line, the general manager of a business unit sees those same metrics across the operation and can drill down to the level of individual products to understand the root causes of pricing performance.
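As a minimal illustration (in Python, with hypothetical field names and invented records), the basic health metrics above could be computed from raw transaction data along these lines:

```python
# Sketch only: field names and records are illustrative, not a real schema.
transactions = [
    {"product": "A", "list_price": 100.0, "net_price": 92.0, "cost": 70.0, "won": True},
    {"product": "A", "list_price": 100.0, "net_price": 88.0, "cost": 70.0, "won": True},
    {"product": "B", "list_price": 250.0, "net_price": 250.0, "cost": 180.0, "won": False},
]

won = [t for t in transactions if t["won"]]
avg_selling_price = sum(t["net_price"] for t in won) / len(won)
avg_discount = sum(1 - t["net_price"] / t["list_price"] for t in won) / len(won)
avg_margin = sum((t["net_price"] - t["cost"]) / t["net_price"] for t in won) / len(won)
win_rate = len(won) / len(transactions)

print(f"ASP: {avg_selling_price:.2f}, discount: {avg_discount:.1%}, "
      f"margin: {avg_margin:.1%}, win rate: {win_rate:.1%}")
```

Grouping the same calculations by product line versus business unit gives the drill-down view described above.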

Without uncovering and acting on the opportunities big data presents, many companies are leaving millions of dollars of profit on the table. The secret to increasing profit margins is to harness big data to find the best price at the product level—not the category level—rather than drowning in the flood of numbers. To read more visit: http://www.mckinsey.com/insights/marketing_sales/using_big_data_to_make_better_pricing_decisions

 


Now Big Data to change the way you hire, fire and promote

Enterprises hire lots of people, but in a world where change happens fast and often, they can't anticipate every need. One solution is to hire contractors for new or temporary projects, but that means recruiters must find people who won't already know the company's systems and culture. A better way is to find someone in-house; in a company with hundreds of employees, though, that can be difficult—unless you let big data do the heavy lifting.

Increasingly, organizations are doing just that: Big Data is helping them match open positions to existing employees' profiles. Organizations are using HR programs that turn the employee profile from a neglected convenience into a powerful tool, giving them a chance to discover abilities that don't appear in a worker's job description or even their own self-description. These programs work by scouring social-media profiles, forums, blogs and comments across the Internet to unearth talent that's under the company's own roof—talent it just didn't know it had. For a detailed article on this topic visit: http://www.theatlantic.com/magazine/archive/2013/12/theyre-watching-you-at-work/354681/.
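As a rough sketch of the matching idea—not any of the actual products alluded to above—an in-house tool might rank employees against an open role by the overlap between the role's required skills and skills mined from each employee's profiles and posts (all names and data here are hypothetical):

```python
# Score employees against an open role by skill overlap. Illustrative only.
def match_score(required_skills, employee_skills):
    """Fraction of required skills found in the employee's mined skill set."""
    required = {s.lower() for s in required_skills}
    found = required & {s.lower() for s in employee_skills}
    return len(found) / len(required)

role = ["python", "data visualization", "sql"]
employees = {  # skills as mined from internal and public profiles
    "emp_001": ["SQL", "Excel", "Python"],
    "emp_002": ["Java", "Spring"],
}

ranked = sorted(employees.items(), key=lambda kv: match_score(role, kv[1]), reverse=True)
for emp, skills in ranked:
    print(emp, round(match_score(role, skills), 2))
```

Real systems would weight skills by recency and evidence strength rather than treating them as a flat set.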


Big Data in Healthcare Analytics: Its potential

The healthcare industry has historically generated large amounts of data, driven by record keeping, compliance and regulatory requirements, and patient care. While most of this data has been stored in hard copy form, the current trend is toward rapid digitization. Driven by mandatory requirements and the potential to improve the quality of healthcare delivery while reducing costs, this big data holds the promise of supporting a wide range of medical and healthcare functions, including clinical decision support, disease surveillance, and population health management. Big data in healthcare refers to electronic health data sets so large and complex that they are difficult to manage with traditional software, hardware, or common data management tools and methods. Big data in healthcare is overwhelming not only because of its volume but also because of the diversity of data types and the speed at which it must be managed. The totality of data related to patient healthcare and well-being makes up "big data" in the healthcare industry. To read more about the advancement of big data in healthcare visit: http://www.mckinsey.com/insights/health_systems_and_services/the_big-data_revolution_in_us_health_care.


How bad data can be misleading

Big Data does not necessarily mean good data. And, as a growing number of experts insist, Big Data does not automatically yield good analytics. As everyone realizes, bad data equates to bad intelligence, which equates to bad decision-making, which equates to bad things happening in your business. If the data is incomplete, out of context or otherwise contaminated, it can lead to decisions that undermine the competitiveness of an enterprise or damage the personal lives of individuals. So how do we detect that? First, it's important to understand where your data originates. Has it been captured by your own workforce? What measures were put in place to ensure that the very best job has been done and that the captured data lives up to expectations? What are the requirements for the data your business needs and uses daily? Do you enhance your data from other sources (external or internal)? An example of how out-of-context data can lead to distorted conclusions comes from Harvard University professor Gary King, director of the Institute for Quantitative Social Science, who attempted to use Twitter feeds and other social media posts to predict the U.S. unemployment rate by monitoring keywords like "jobs," "unemployment," and "classifieds." To read more about the episode visit: http://www.infoworld.com/d/business-intelligence/big-data-without-good-analytics-can-lead-bad-decisions-225608.
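As a minimal sketch of the kind of basic completeness and plausibility checks those questions point to (records, field names and thresholds are all invented):

```python
# Flag suspect rows before they feed any analysis. Illustrative data only.
records = [
    {"customer_id": "C1", "age": 34, "source": "web_form"},
    {"customer_id": "C2", "age": None, "source": "web_form"},
    {"customer_id": "C3", "age": 214, "source": "purchased_list"},
]

issues = []
for i, r in enumerate(records):
    if r["age"] is None:
        issues.append((i, "missing age"))
    elif not 0 <= r["age"] <= 120:
        issues.append((i, "implausible age"))
    if r["source"] != "web_form":  # externally sourced rows deserve extra scrutiny
        issues.append((i, "external source, verify capture quality"))

print(issues)
```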


Embedded Analytics in customer experience

Organizations are always searching for ways to help their customers. Better service, customized products, and price guarantees are just a few of the ways organizations try to ensure customer loyalty. For service and data providers, however, it is not always easy to provide added value beyond the service or data itself. The promise of more data and better insight helps ensure customer satisfaction by giving customers the tools they need to gain additional insights. Non-profit, educational, and government organizations are examples of sectors that have posted demographics or other analytical data online for public use. In practice, this means organizations build applications, embedded within their solutions, that can be offered as a service to customers. This access to analytics helps furnish customers with broader insights into their accounts, customers, trends, reporting needs, and so on. Generally, organizations include these extra reporting or analysis capabilities as an add-on to the services or data already provided. To read more visit: http://midsizeinsider.com/en-us/article/using-embedded-analytics-enhanced-customer-experie#.U5_Dw_mSxvM


An overview of Text Mining

Text mining, sometimes referred to as "text analytics", is one way to make qualitative or "unstructured" data usable by a computer. Also known as text data mining, it refers to the process of deriving high-quality information from text. High-quality information is typically derived through the discovery of patterns and trends by means such as statistical pattern learning. Text mining usually involves structuring the input text, deriving patterns within the structured data, and finally evaluating and interpreting the output. 'High quality' in text mining usually refers to some combination of relevance, novelty, and interestingness. Typical text mining tasks include text categorization, text clustering, concept/entity extraction, and sentiment analysis. Text analysis involves information retrieval, analysis of word frequency distributions, pattern recognition, tagging, information extraction, data mining techniques including link and association analysis, visualization, and predictive analytics. The main goal is, essentially, to turn text into data for analysis via natural language processing (NLP) and analytical methods. To read more about text mining: http://www.scientificcomputing.com/blogs/2014/01/text-mining-next-data-frontier.
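As a small illustration of the first two steps—structuring raw text, then deriving a simple pattern (word frequencies)—consider this sketch with made-up documents:

```python
import re
from collections import Counter

docs = [
    "Text mining turns unstructured text into structured data.",
    "Typical text mining tasks include clustering and sentiment analysis.",
]

stopwords = {"the", "and", "into", "a", "of"}  # tiny illustrative stoplist
tokens = []
for doc in docs:
    tokens += [w for w in re.findall(r"[a-z]+", doc.lower()) if w not in stopwords]

print(Counter(tokens).most_common(5))  # e.g. [('text', 3), ('mining', 2), ...]
```

Real pipelines add stemming or lemmatization, richer stoplists, and weighting schemes such as TF-IDF before any pattern learning.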


Extracting insights from mobile data

Mobile phones serve a dual purpose in the context of Big Data. Each mobile phone, non-smartphones included, creates numerous types of data every day, including call detail records, SMS data, and geo-location data. Smartphones also generate log data via the use of mobile applications, financial transaction data associated with mobile banking and shopping, and social media data from updates to Facebook, Twitter and other social networks. The volume of mobile data, and the velocity at which it is created, will only grow as the global population, mobile phone penetration rates, and the use of social media all rise. When analyzed effectively, this data can provide insight into customer sentiment, behaviour and even physical movement patterns. Because of the sheer number of mobile phones in use, Big Data practitioners can tap mobile Big Data analytics to better see such trends across large populations and sub-segments of customers, to improve engagement strategies and the delivery of services. It becomes especially valuable for analytics purposes when combined with external data sources, such as weather data and economic data, which allow analysts to relate macro-level trends to targeted sub-segments of customers. To read more: http://wikibon.org/blog/the-dual-role-of-mobile-devices-for-big-data/
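As a toy illustration of that last point—joining mobile data with an external source—here is a hypothetical daily join of call-volume records with weather observations (tables and numbers invented, using pandas):

```python
import pandas as pd

calls = pd.DataFrame({
    "date": ["2014-06-01", "2014-06-02"],
    "region": ["north", "north"],
    "call_volume": [10500, 8200],
})
weather = pd.DataFrame({
    "date": ["2014-06-01", "2014-06-02"],
    "region": ["north", "north"],
    "rainfall_mm": [0.0, 22.5],
})

merged = calls.merge(weather, on=["date", "region"])   # align by day and region
print(merged[["call_volume", "rainfall_mm"]].corr())   # relate usage to weather
```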


Big Data meets weather forecasting

Big models and big data have long been a feature of weather and climate modelling. Computer-generated global weather forecasts are initialized from millions of diverse observations from satellites, weather balloons, surface weather stations, ships and buoys. Data assimilation, the process of optimally blending these observations into the forecast model, is the most computationally demanding part of producing a global forecast, and is a critical component of forecast skill. The international climate modelling community has evolved interesting infrastructure and social institutions that enable a diverse community of interested users to obtain standardized results from leading climate models developed around the world, to capture aspects of climate modelling certainty and uncertainty and help inform decision-makers and the interested public.
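As a back-of-the-envelope illustration of the assimilation idea—nothing like an operational scheme—a single observation can be blended with a forecast by weighting each according to its error variance:

```python
# Toy scalar data assimilation, assuming Gaussian errors.
def assimilate(forecast, obs, var_forecast, var_obs):
    """Minimum-variance blend of a forecast value and an observation."""
    gain = var_forecast / (var_forecast + var_obs)  # how much to trust the obs
    return forecast + gain * (obs - forecast)

# Model says 21.0 C (error variance 4.0); a station reports 18.5 C (variance 1.0).
print(assimilate(21.0, 18.5, 4.0, 1.0))  # 19.0, pulled strongly toward the obs
```

Operational systems apply the same principle across millions of observations and model grid points at once, which is what makes assimilation so computationally demanding.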

Beyond the thriving data services industry, weather has huge economic and public-safety ramifications. Weather Analytics, a company that provides climate data, estimates that weather affects more than one-third of worldwide GDP, influencing the agriculture, tourism, fishing, recreation and air transport industries, to name just a few. Uncertain weather conditions likewise affect small business owners. Moreover, public safety is of vital concern when officials aim to understand the impact of extreme weather events such as hurricanes, tsunamis, or wildfires. To know more about this aspect, go through Per Nyberg (Senior Director of Business Development at Cray)'s article: http://www.informationweek.com/big-data/big-data-analytics/3-ways-big-data-supercomputing-change-weather-forecasting/a/d-id/1269439


Animal conservation using Big Data

When people think about saving rare species, they think of remote jungles, researchers and people chaining themselves to trees. The stereotyped idea is that animals in the wild are extremely hard to track, and that the only way to do it is a basic tracking system in which a small sample stands in for the wider population. Big Data and the complexities of data analysis could not be further from this, with the collection of massive data sets combined with complex predictive models and algorithms creating insights. The idea that enough data could even be collected to make a useful analysis used to be hard to imagine. However, this has changed of late, as HP has partnered with Conservation International (CI) to create Earth Insights. The system is intended to provide early warning on animal numbers among endangered species around the world. Using cameras and climate sensors, it can gather information from around 1,000 of these devices and use it to collate data on population numbers. To know more about this aspect, go through Dan Worth (news editor of V3)'s article: http://www.v3.co.uk/v3-uk/news/2318103/hp-big-data-tools-help-wildlife-charity-save-the-planet
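As a minimal sketch, with invented data, of how raw camera-trap detections might be rolled up into the per-species counts an early-warning rule could watch:

```python
import pandas as pd

detections = pd.DataFrame({  # one row per camera-trap sighting (invented)
    "camera_id": [1, 2, 2, 3, 1],
    "species": ["jaguar", "jaguar", "tapir", "tapir", "tapir"],
    "month": ["2014-05", "2014-05", "2014-05", "2014-06", "2014-06"],
})

monthly = detections.groupby(["species", "month"]).size().unstack(fill_value=0)
print(monthly)

# Flag any species whose detections fell by more than half month-over-month.
drop = monthly["2014-06"] < 0.5 * monthly["2014-05"]
print(monthly.index[drop].tolist())  # ['jaguar'] with the data above
```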


Analytics to combat fraudsters

Fraudsters are more capable, better organized, and more creative than at any time in the recent past. Their fraud schemes involve complex networks of individuals, accounts, and events. The evidence for these schemes may exist on multiple systems, span various data types, and deliberately conceal the underlying activity. An analyst may therefore have abundant investigative leads across these systems with no real way to join the data or the results. To prevent and uncover fraud, one needs a solution that is more advanced and sophisticated than the fraudsters. A basic step in fraud detection analytics is visualizing the patterns in your data between people, places, accounts, and events. These data mining and deep-analysis capabilities provide more context and better information, enabling more accurate data segmentation and labelling, which further improves pattern recognition. To read more about it: http://www.21ct.com/solutions/fraud-detection-analytics/.
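As a hedged sketch of the linking idea—connect people who share accounts, then surface unusually entangled clusters for review—using the networkx library and invented data:

```python
import networkx as nx

# (person, account) pairs, e.g. drawn from claims or transaction records
links = [("p1", "acct_A"), ("p2", "acct_A"), ("p2", "acct_B"),
         ("p3", "acct_B"), ("p4", "acct_C")]

g = nx.Graph(links)  # bipartite person-account graph
for component in nx.connected_components(g):
    people = {n for n in component if n.startswith("p")}
    if len(people) >= 3:  # several people entangled via shared accounts
        print("review cluster:", sorted(component))
```

Production systems layer entity resolution, time windows, and scoring on top of this, but the shared-entity graph is the common starting point.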


Data Mining in Sports: A pragmatic approach to the game

Professional sports organizations are multi-million-dollar enterprises, with millions of dollars riding on a single decision. With this amount of capital at stake, just one bad or misguided decision can set an organization back by several years. So much risk creates a critical need to make good decisions, and that makes sports an attractive environment for data mining applications.

Sports data mining has experienced rapid growth in recent years. The challenge is not how to collect the data, but what data should be collected and how to make the best use of it. From players improving their game-time performance using video analysis techniques, to scouts using statistical analysis and projection techniques to identify which talent will provide the biggest impact, data mining is quickly becoming an integral part of the sports decision-making landscape, where managers and coaches can use machine learning and simulation techniques to find optimal strategies for an entire upcoming season. By finding the right ways to make sense of data and turn it into actionable knowledge, sports organizations have the potential to secure a competitive advantage over their peers. To read more about how it has been used: http://www.ukessays.com/essays/psychology/data-mining-in-sports-in-the-past-few-years-psychology-essay.php
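As a minimal illustration of the projection idea, with made-up numbers: fit a simple trend to a player's past seasons and extrapolate one season ahead.

```python
import numpy as np

seasons = np.array([2011, 2012, 2013, 2014])
points_per_game = np.array([14.2, 16.1, 17.8, 19.0])  # invented stat line

slope, intercept = np.polyfit(seasons, points_per_game, 1)  # linear trend
projection = slope * 2015 + intercept
print(f"projected 2015 PPG: {projection:.1f}")
```

Real projection systems compare a player against historically similar players and regress toward league averages, but the extrapolation step is the same in spirit.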


The Evolution of Video Analytics

Video analytics identifies events or patterns of behaviour through analysis of video from monitored environments. It monitors video streams in real time, automatically raising security alerts as incidents occur and analysing historical data to identify specific incidents and patterns. It helps agencies organize, analyse and share the insight gained from the data to make smarter decisions and enable better coordination.

Using a combination of algorithms, video analytics analyses captured video in real time and presents alerts about whatever the application is programmed to identify. Today's video analytics applications are able to do much more than just identify motion, and false alarms have been reduced to negligible rates because the software can automatically filter out motion caused by wind, snow, rain and changes in lighting. Some applications can now also detect tampering, and can automatically adjust the visual parameters of enabled video cameras according to individual scene characteristics to ensure optimal brightness and contrast for viewing and recording. To read more about the evolution of video analytics read here: http://www.sourcesecurity.com/news/articles/co-2173-ga.2504.html
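As a sketch of the most basic motion-detection step underlying such systems—differencing consecutive frames and alerting when enough pixels change—using OpenCV (the file name and thresholds are illustrative):

```python
import cv2

cap = cv2.VideoCapture("camera_feed.mp4")  # hypothetical recorded stream
ok, prev = cap.read()
prev = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (21, 21), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    diff = cv2.absdiff(prev, gray)                      # pixel-wise change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:                   # enough change -> alert
        print("motion detected")
    prev = gray

cap.release()
```

The filtering of wind, rain and lighting changes described above is what separates commercial analytics from this bare-bones differencing loop.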


Quantifying Twitter sentiments

This article elaborates on sentiment analysis of tweets using data mining techniques. Instead of using SQL, it shows how to conduct such analysis using a more sophisticated tool called RapidMiner. It explains how one can extract Twitter data into a Google Docs spreadsheet and then transfer it into a local environment using two different methods. The emphasis is on how to amass a decent pool of tweets in two different ways using a service called Zapier, Google Docs and a tool called GDocBackUpCMD, along with SSIS and a little bit of C#. Zapier is used to extract Twitter feeds into the Google Docs spreadsheet, and the data is then copied across to the local environment to mine it for sentiment trends. Next, it is shown how this data can be analyzed for sentiment, i.e. whether a given Twitter feed can be considered negative or positive. For this purpose, RapidMiner is used along with two separate data sets of pre-categorized tweets for model learning, and Microsoft SQL Server serves for some data polishing and as the storage engine. Read more at: http://bicortex.com/twitter-sentiment-analysis-mining-twitter-data-using-rapidminer-part-1/
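This is not the RapidMiner workflow from the article, but the underlying classification idea can be sketched with a simple word-list scorer (word lists and tweets invented):

```python
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "hate", "terrible", "awful"}

def tweet_sentiment(text):
    """Label a tweet by counting positive vs. negative words."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(tweet_sentiment("I love this great product"))      # positive
print(tweet_sentiment("terrible service, never again"))  # negative
```

A trained model like the one built in RapidMiner learns these word weights from the pre-categorized tweets instead of relying on a hand-made list.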
