SigmaWay Blog

SigmaWay Blog aggregates original and third-party content for site users. It features articles on Process Improvement, Lean Six Sigma, Analytics, Market Intelligence, Training, IT Services and the industries that SigmaWay caters to.

The author is a post-graduate student pursuing a Masters in Economics at the Madras School of Economics, Chennai.

Big Data transforming the Gaming Industry

Ever since the video gaming industry entered the online space, this fast-paced industry has been booming. Bringing in about $20 billion of revenue a year in the U.S. market alone, the industry is now scaling new heights thanks to vast outpourings of data and advances in big data analytics. In the world of online and offline video games, joystick movements and every interactive step are a source of valuable data points that can deliver fascinating solutions for enhancing the gamer experience and increasing revenue streams. The world of games is like any other form of entertainment: it requires addictive rewards to make gamers come back again and again. According to market reports, a large video-game manufacturer has the potential to generate around 50 terabytes of data each day. In the US, the gaming industry is bigger than the movie industry in terms of revenue generation. Now that this prolific industry is rapidly embracing big data technologies, it is expected to make waves in novel methods of customer engagement, optimized and targeted advertising, and an enhanced end-user experience.

Read more at:



Addressing Cyber threats using Big Data

Malware and cyber-attacks have become increasing concerns for companies. Many of them struggle to predict and mitigate threats, which can spring up and evolve quickly. Companies also have concerns about the physical security of their infrastructure. The result is that they are in search of more comprehensive resources to address these challenges. Big data analytics, particularly when coupled with machine learning, represents a logical solution because it allows companies to consider multiple threat scenarios and determine the best response.

In today’s complex network environments, Advanced Persistent Threats (APTs) and other cyber threats may be eradicated by gathering intelligence from data providers.

To counter these threats, appliances should monitor threat feeds from trusted providers for indicators of compromise (IOCs), including big data feeds such as domain name system (DNS) feeds, command and control (C2) feeds, and black/white lists, in order to correlate and hunt threats in a data set. The article also recommends six steps to combat potential threats.

Read the article at:



Big Data Opportunities for Travel Companies

Big data has been big news in the travel industry for a few years now. It offers travel companies the chance to increase sales and improve the customer's travel experience. While big data is a vast, complex challenge for many organizations, it is one of the key factors driving the evolution of the travel industry today and for the foreseeable future. Airlines, hotels, cruise companies, travel management companies, railways and travel agencies all have an opportunity to improve their business and the customer experience by effectively handling the big data at their fingertips. That's not to say it is easy to collect, identify and analyze all of the bits of disparate data types that comprise big data. Travel companies can make better decisions based on the data aggregated from their customers, personalizing services like travel booking to make them easier to navigate. For many years, CRM systems have allowed companies to create relevant, targeted marketing campaigns. Big data, however, uses customer information to create a truly personalized service. It allows travel websites to recommend a specific hotel to a specific user based on their previous holidays, requirements and preferences.

Read more at:


Fighting crime with Big Data Weapons

Big data analytics is playing an increasing role in the fight against crime. Publicly shared information, combined with data from local authorities, social services and intelligence gathered by beat officers, is helping police forces around the world spot trouble before it starts. It makes the police much less reactive, and slowly starts to reveal the real trouble spots and troublemakers in a neighborhood, estate or street. When information like that becomes clear, the police can do something about it long before anyone dials 999. And that counts for people as much as it does for pubs or clubs. Law enforcement is finding new ways to use technology and big data against crime. Cameras are no longer confined to fixed CCTV installations; they are commonly mounted in police cars and carried by officers to create a permanent digital record of everything going on around them.

This will make it harder for criminals to commit crimes. In recent years we have witnessed criminality moving off the streets, with a huge increase in the amount of credit card and online identity fraud. But even there, new big data algorithms are being developed to detect fraudulent behavior in real time.
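As a concrete illustration of the idea, a fraud screen can be as simple as flagging a card transaction whose amount deviates sharply from the account's history. The sketch below is a toy stand-in for the real-time algorithms mentioned above; the data and the threshold are made up.

```python
# Toy fraud screen: flag amounts far outside an account's usual spending.
from statistics import mean, stdev

def is_suspicious(history: list[float], amount: float, z_threshold: float = 3.0) -> bool:
    """Return True when `amount` is more than z_threshold standard
    deviations away from the account's historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

past = [25.0, 40.0, 31.0, 28.0, 35.0, 22.0]
print(is_suspicious(past, 30.0))   # → False (typical spend)
print(is_suspicious(past, 950.0))  # → True  (outlier)
```

Real systems combine many more signals (merchant, location, time of day) and learned models rather than a single z-score, but the shape of the check is the same: score each transaction against the customer's own history as it arrives.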

Read more at:



Impact of the Internet of Things and Real time Analytics

Big data is a key piece of infrastructure in the Internet of Things (IoT), but it's far from the only piece of the fabric. In the coming global order, every element of the natural world, and even every physical person, can conceivably be networked. Everything will be capable of being instrumented. Driverless cars, robots carrying out maintenance in hazardous locations like oil rigs, or advertising that reads and responds to individuals' unique facial expressions may sound like science fiction, but these are all developments happening today, and they're prompting an exciting new phase in analytics that needs to be addressed now. As these trends come to fruition, each of us will evolve into a walking, talking, living beneficiary of the Internet of Things. Those who embrace data will be more likely to be surfing on top of the wave of creative destruction, instead of having it crash down on top of them.

Read more at:


Over-relying on Data

Data is more than just power. Organisations across all industry verticals are upgrading their data management systems, investing in new resources, and using their rich databases to streamline the practices of their departments. Indeed, the message from experts is clear: organisations that fail to adapt and evolve to meet the emergence of big data face the prospect of falling behind. As with any phenomenon, however, there are lessons to be learnt.

The way data is being used in sports is a poignant example. Moneyball, the popular book inspired by Oakland Athletics general manager Billy Beane, explained the core philosophy of his vision for the baseball team: using statistical analysis to maximise player acquisition and performance on a low budget.

The Moneyball philosophy had huge ramifications for the sporting world. People started adopting variants of it in all sports, from soccer to basketball to football. Arguably the most noticeable application of Beane's philosophy was by Andy Flower, the former England cricket coach. Flower was known for his admiration of Beane's work, and he too used statistical analysis to determine not only who would be on the field but also what decisions players should make once they were selected, and he enjoyed notable victories. Both men have stood by data analytics and the benefits it can bring. Yet what is often untold is that data was both a virtue and a vice for both of them.

Flower's 5-0 defeat in the Ashes last year was one of England's most disappointing performances to date. As commentators suggested, it was a classic case of over-reliance on data: replacing intuition with numbers, and allowing data to dictate rather than inform. Flower ultimately got the balance between trusting people and trusting numbers wrong. He was in good company. Those who thrive will not be those who use data most, but those who use it most smartly. Data is emphatically not a substitute for intuition and flair, either in the office or on the cricket field.

These instances of sports analytics are particularly relevant for organisations looking to add big data analytics to their existing operations. The examples of Beane and Flower show how data does not have all the answers, and relying too heavily on it can have devastating effects.

Read more at:


How big data will change the Fashion Industry

Big data analytics is all about turning the volume and variety of data into meaningful insights. When data is refined and combined, new patterns and ideas emerge, and one can make better decisions using these insights. Big data is being used across almost all sectors these days. In the online apparel industry, where the success of next season's collection hinges on selecting the right designs, colors, fabrics, shapes, and sizes, big data can be a big game changer. The online apparel industry is mostly driven by predictions based on identifying the most popular parametric values (colors, fabric, style and more) of the apparel. Predict it right and it may bring a profitable season; get it wrong and it may leave heaps of discarded inventory. For many years, analysts and fashion reporters have tried to call these trends. Recognizing customer preferences in advance is a great advantage that translates into more prospects converting into buyers.

One way to understand the emotions behind customers' interactions on social media sites and other forums is sentiment analysis. Sentiment analysis scans tweets, comments, likes, etc., for evidence of positive, negative, or indifferent impressions to identify the overall trend of sentiment towards an entity. A positive expression on a personal level might be "I like to wear plain cotton clothes in summers…", while a more general positive opinion could be "I am looking for some cool blue apparel for my next vacation as it feels comfortable."

Similarly, a negative personal expression might be "I am fed up with seeing bright yellow apparel all around in summer," while a general negative opinion might be "People look damn horrible in yellow." A range of tools and methods is available to help determine customers' sentiments; one way to track them is using the Twitter Sentiment APIs.
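To make the idea concrete, here is a minimal sketch of lexicon-based sentiment scoring. The tiny word lists are illustrative assumptions, not a real lexicon; production tools such as the Twitter Sentiment APIs mentioned above are far more sophisticated.

```python
# Minimal lexicon-based sentiment scorer -- a sketch, not a production tool.
# These tiny word lists are made up for illustration.
POSITIVE = {"like", "love", "cool", "comfortable", "great"}
NEGATIVE = {"horrible", "fed", "hate", "awful"}

def sentiment(text: str) -> str:
    """Label text positive/negative/neutral by counting lexicon hits."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I like to wear plain cotton clothes in summers"))  # → positive
print(sentiment("People look damn horrible in yellow"))             # → negative
```

Scoring the two example opinions above correctly is easy; the hard part in practice is sarcasm, negation ("not comfortable") and slang, which is why real systems use trained models rather than word lists.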

Big data is also extremely useful in a marketing capacity, drawing on information like customer demographics and spending habits: how much customers spend, on what, and where. Beyond these habits, companies that invest in cloud computing studies can monitor how their existing marketing strategies are working; eye-tracking data, for example, can be analyzed to gauge the effectiveness of billboards and other visual advertising. Every aspect of the business will change, from what color will be in next season to how to make clothing that fits different body types and how to optimize supply chains.

Read more at:



Lessons from Big Data That Apply To Real Estate

Big data is the basis for business intelligence, which is about taking all that information and turning it into knowledge to drive better business decisions. Whether it's data about retail consumers or homebuyers, it's all the same game. The business intelligence industry has been analyzing large data sets in corporations for years — decades, really. It's only now coming to the real estate industry. The amount of data used in the real estate industry isn't that large: a single major retailer will generate more sales data in a year than the entire real estate industry will in a decade. However, it's all relative, and the real estate industry is still trying to figure out what data it has, let alone how to use it.

The point is that big data in real estate is about presenting a “whole consumer” picture. It’s about using data to find out who buys what, when, where, why and how. It’s about finding out who will sell a house — when, where, why and how. 

All that data can be used to create tangible insights into consumer behavior using forecasting and modelling software. It's the analysis that makes the magic happen: identifying customers or providing them better services. Analytics is where raw data and the algorithms that crunch it come together. Analytics tools mine census information, the results of consumer surveys, listings of homes for sale and rent, geographic information systems data and more, combining what they draw from numerous databanks with their own proprietary user-generated content. The tools can deliver to consumers information about their property's potential value and help them understand home-value trends within a particular milieu, such as a neighborhood or a ZIP code.

Beyond the consumer and industry-facing aspects of big data, institutions such as banks can plug into big data resources to determine whether a foreclosure or short sale is really worth what a buyer or investor might be offering.

For now, the analysis of big data is likely to stay with those who gather it and companies willing to pay for access, such as the lead generation companies. What real estate agents need to know now is that the data is there and it's available, in some form or another, to those who are willing to use the right tools.

Read more at:



Ford accelerates through Big Data

Big data has the automobile in its sights, and the results will be good for both the vehicle and its owner. In the coming years we can expect to see both safer vehicles and car-to-car communications. You'll be advised of a needed repair before a problem occurs, and recall notices will be delivered through the car. Ford gathers data from over four million cars with in-car sensors and remote application management software. All data is analyzed in real time, giving engineers valuable information to notice and solve issues as they arise and to understand how the car responds in different road and weather conditions and to any other forces that could affect it. Ford also installs numerous sensors in its cars to monitor behavior: over 74 sensors per car, including sonar, cameras, radar, accelerometers, temperature sensors and rain sensors. As a result, its Energi line of plug-in hybrid cars generates over 25 gigabytes of data every hour. This data is sent back to the factory for real-time analysis and returned to the driver via a mobile app. The cars in its testing facility generate up to 250 gigabytes of data per hour from smart cameras and sensors.
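Ford's actual telemetry pipeline is not public, but the kind of rolling, real-time aggregation such a system might apply to an in-car sensor stream can be sketched very simply. The readings below are made up.

```python
# Sketch: smoothing a live sensor stream with a fixed-size rolling window,
# the sort of per-signal aggregation a telematics pipeline might run.
from collections import deque

class RollingAverage:
    """Keeps the mean of the last `size` sensor readings."""
    def __init__(self, size: int):
        self.window = deque(maxlen=size)  # old readings fall off automatically

    def add(self, reading: float) -> float:
        self.window.append(reading)
        return sum(self.window) / len(self.window)

temp = RollingAverage(size=3)
for reading in [70.0, 72.0, 95.0, 96.0]:   # hypothetical temperature readings
    smoothed = temp.add(reading)
print(round(smoothed, 2))  # → 87.67
```

A real pipeline would run thousands of such aggregations in parallel, per signal and per vehicle, and raise alerts when a smoothed value drifts out of its expected band.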

Big data is also used to find out how people want their cars to be improved. Nowadays, Ford listens carefully to what its customers are saying online, on social networks and in the blogosphere; it performs sentiment analysis on all sorts of online content and uses Google Trends to predict future sales.

Internally, Ford uses big data to optimize its supply chain and increase its operational efficiency. From the parts before they reach the Ford factory to the car waiting at the dealer for a customer, big data has infiltrated every part of the supply chain, creating large amounts of data. With so many different parts coming from so many different suppliers, it is vital for Ford to have a complete and detailed overview of all parts within the supply chain at any moment in time.

To read more visit:



Big Data and privacy concerns

In the era of big data, the battle for privacy has arguably already been fought and lost. Personal data is routinely gathered and exchanged, and there are few effective controls over how it is used or secured. Data scientists and analysts are now saying that it is time for legislation to recover some of that privacy and guarantee that any information that is gathered remains secure.

We have become the product: we are being productised, monetised and sold, with the services we use, such as Facebook and Twitter, as the inducement. The dilemma regulators face is how to regulate the collection, storage and trading of personal data on the internet when all of these activities, and the corporations themselves, operate across multiple continents and jurisdictions.

The task of reclaiming some semblance of privacy is all the more urgent because the rate at which personal data is being collected is accelerating. The buzz around big data is attracting millions of dollars from investors and brands hoping to turn a profit, while intelligence agencies are also furiously collecting information about our online activities for quite different purposes.

And alongside these, there are also the black-market operators who make millions of dollars a year out of things like identity theft, matching disparate data sets across the web to help identify people who might be suitable targets for a scam.

New privacy principles were recently passed into law requiring all businesses earning more than $3m annually to disclose to customers how their information is stored and used. However, the new legislation stopped short of mandating compulsory data-breach notifications for businesses that fall victim to security violations.

A bill that would make it illegal to hide security problems was set to pass into law last year, but it failed to make it through both houses of parliament before the election. And since the Coalition took power, the legislation has stalled.

Still, there are many privacy challenges ahead, and the problems have by no means been solved. Most methods of anonymizing data do not scale well as the number of attributes (p) or records (n) grows large: either they add so much noise that new analyses become nearly impossible, or they weaken the privacy guarantee. Network-like data poses a special challenge for privacy because so much of the information has to do with relationships between individuals. In summary, there appears to be no free lunch in the trade-off between privacy and information. To read more:


How license plate databases track your every move

License plate scanning technology has been around for decades: the British police originally adopted it in the 1970s to track Irish Republican Army members, but it only came into wide use in the last decade as cheaper yet highly effective models became available. These scanners use high-speed cameras and optical character recognition technology to capture up to 1,800 plates per minute, even at high rates of speed and in difficult driving conditions. The scanner also records the date, time, and GPS location of each scan. The explosive growth of license plate readers has been accompanied by large police-compiled databases that store scan records for arbitrary, possibly indefinite, periods of time. One of the problems with this practice is that different states have different policies on how such data can be used and shared: some states put strict controls on the use of this information, while others have what amounts to an open-door policy on driver information.

License plate readers have become vastly more popular in recent years thanks to falling prices, federal funds, and an aggressive marketing campaign from device manufacturers. In theory, they're a great way to find stolen property, track fleeing criminals, or keep an eye on felons with a high risk of re-offense. At present, however, there are virtually no limits on data retention, usage, or who has access to the information. As the technology becomes more popular, an increasing number of police departments are deploying readers on patrol cars as well as at fixed locations.

A license plate, the argument goes, merely identifies a particular vehicle registered to a specific person. There are two problems with this argument. First, the police aren't just using these readers to track known criminals; they're building associative databases of people who have never been charged with any crime, on the grounds that such information might be useful in the future. Not only does this have a known chilling effect on people's actions, it opens the door to profiling groups of people based on the erroneous belief that doing so will help identify future criminals.

Second, as things stand right now, most of these databases are open to anyone who wants a look at them. Sure, your boss can't technically fire you for your political affiliation, but he can check and see where your car was when Obama last came to town, or whether it was picked up outside a polling station on election day. Then, come the next performance review, you're out of a job with no idea why. To read more visit:


Big Data on Organ Transplant Market

With more than 120,000 people in need of organ transplants and a shortage of donors, economists, doctors and mathematicians are using data to save lives. On a very basic level, the organ transplant process can be separated into two categories: organs taken from living donors and organs harvested from deceased donors. From living donors, doctors can take one of a person's two kidneys, as well as part of his or her liver. From a deceased donor, doctors are able to extract the kidneys, liver, heart, lungs, pancreas, intestines and thymus. Of the organs donated in 2013, roughly 80% came from deceased donors, according to UNOS. While it's preferable to receive a kidney from a living donor, the donors and candidates are incompatible in approximately one-third of potential kidney transplants because of mismatched blood or tissue types. In the case of incompatibility, a candidate is placed on what's commonly referred to by the public as a "waiting list". UNOS receives information from both the candidate and the deceased donor to establish compatibility: blood type; body size, since thoracic organs like the heart and lungs need to be transplanted into a similarly sized recipient; and geography, as it seeks to match candidates locally, regionally and then nationally. With that data, UNOS' algorithm rules out the incompatible. It then ranks the remainder based on urgency and geography. For example, a liver made available in Ohio would theoretically go to the closest compatible candidate with the highest MELD (Model for End-Stage Liver Disease) score.
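The matching logic just described — rule out the incompatible, then rank by urgency and proximity — can be sketched in a few lines. This is a drastic simplification of UNOS' real algorithm: the names, MELD scores and distances are hypothetical, and real blood-type compatibility is more nuanced than the exact match used here.

```python
# Sketch: filter incompatible candidates, then rank by urgency and proximity.
# All candidate data is hypothetical.
candidates = [
    {"name": "A", "blood": "O", "meld": 32, "distance_km": 900},
    {"name": "B", "blood": "A", "meld": 38, "distance_km": 100},
    {"name": "C", "blood": "O", "meld": 28, "distance_km": 50},
    {"name": "D", "blood": "O", "meld": 32, "distance_km": 120},
]

def rank(donor_blood: str):
    # Simplification: treat only an exact blood-type match as compatible.
    compatible = [c for c in candidates if c["blood"] == donor_blood]
    # Highest urgency (MELD score) first; break ties by shortest distance.
    return sorted(compatible, key=lambda c: (-c["meld"], c["distance_km"]))

order = rank("O")
print([c["name"] for c in order])  # → ['D', 'A', 'C']
```

Note how the tie between A and D (both MELD 32) is broken by distance, mirroring the "closest compatible candidate with the highest MELD score" rule described above.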

In 2010, UNOS launched its Kidney Paired Donation Program, which uses Sandholm and his team's algorithm. So far, the program's matches have resulted in 97 transplants, with more than a dozen scheduled in the coming months. To read in detail visit:


Improving Retail performance with Locational Analytics

Traditionally, businesses have relied on graphs and charts to analyse crucial information. But these basic visualizations have a propensity to miss two of the most important aspects of a retailer's data: where things are located and what is happening around them. Imagine being able to better understand where customers live, what they buy, what they do and why they do it. Location analytics is a game changer. It helps organisations see where data is, not just what it is. Location analytics brings together dynamic, interactive mapping; sophisticated spatial analytics; and rich, complementary data to enhance the overall picture of business operations. Best of all, it is available from within already-established analytics software, so there is no need to say goodbye to familiar business tools or workflows.

The combined solution joins key business intelligence (BI) data with spatial location, resulting in improved store performance driven by better marketing decisions. It covers all stores operated by the group to guide expansion and development strategy, optimize direct marketing actions such as distribution of weekly circulars, monitor store performance, and gain a better understanding of the sales territory. Moreover, it helps in viewing and analysing data, including traditional retail information such as trade and mailing areas, competition analysis, customer locations, and advertising hoardings. Geographic data used includes Bing Maps, Nokia data, and aerial and satellite images. A BI map service’s bi-directional link provides a unique and dynamic integration solution between the mapping and BI systems.

The geo-marketing application is used for many strategic activities, such as guiding the company's expansion and development strategy and optimizing direct-marketing actions like distributing weekly circulars; store performance can be monitored and a better understanding of territories provided. All this information feeds one database and can be shared across the enterprise. Location analytics is enabling a refined and deeper understanding of how to improve marketing and other store-level operations. It enriches data for a more intimate understanding of customer relationships, behavior, and needs. See more at:




How casinos are betting on big data

Billions of dollars are lost by gamblers every year along the Vegas Strip, but some casino operators are taking strides to soften the blow of serious gambling losses and leveraging big data to keep customers coming back, according to one executive.  "They could win a lot or they lose a lot or they could have something in the middle. So we do try to make sure that people don't have really unfortunate visits," said Caesars Entertainment Chairman and CEO Gary Loveman on Big Data Download.  Caesars and other casino operators offer loyalty programs. As gamblers spend, companies gather data on those spending trends. Customers also receive tailored incentives for gambling and spending. 

"We give you very tangible and immediate benefits for doing so. So we give you meals, and hotel rooms and limousines and show tickets. You share with us information on what you've been doing, what sorts of transactions you've made," said Loveman, whose company is the biggest U.S. casino operator.

Caesars in particular employs about 200 data experts at its Flamingo Hotel alone. They scour through data on the types of games customers have played, which hotels they've stayed at and where they've been dining. So the next time you visit a casino, expect a suddenly friendlier slot machine when you are on a losing streak.

Read the complete report here:


How to Measure Social Media ROI

Social media now holds a place alongside print and broadcast as a major, essential marketing channel for businesses. As such, social media now should be held to the same standard as those channels: your social media ROI needs to contribute to your bottom line. To prove that your social media investment is truly warranted, you need to track how social is influencing every interaction you have with your clients.

The first step involves setting social media goals that complement existing business and departmental goals. If you have set a specific number of leads you're trying to attain this quarter, set the number of leads you want to be driven specifically by social media. If one of your goals is to increase landing page conversion by, say, 10%, ensure that you're tracking the conversion rate of people who land on the page through social channels. Audit your existing social media performance to establish baseline targets, and then set appropriate goals for improvement.

Once you’ve established your social media goals, you’ll need to identify and implement the tools and processes required to measure the ROI on your social media. This may involve adding tracking codes to URLs, building custom landing pages, and more. There are a variety of social media analytics tools available to track the diverse metrics you are after.
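Adding tracking codes to URLs can be as simple as appending the common UTM parameters, so your analytics tool can attribute each visit to the network and campaign that drove it. A minimal sketch; the landing page URL below is hypothetical.

```python
# Sketch: tagging a campaign link with UTM tracking parameters so that
# social traffic can be attributed in analytics.
from urllib.parse import urlencode

def tag_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    params = urlencode({
        "utm_source": source,      # which network sent the click
        "utm_medium": medium,      # channel type, e.g. "social"
        "utm_campaign": campaign,  # which campaign gets the credit
    })
    return f"{base_url}?{params}"

# Hypothetical landing page, for illustration only.
print(tag_url("https://example.com/landing", "twitter", "social", "spring_launch"))
# → https://example.com/landing?utm_source=twitter&utm_medium=social&utm_campaign=spring_launch
```

Generate one tagged URL per network and per campaign; the conversion rates you then see per `utm_source` are exactly the social-driven numbers the goals above ask you to track.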

Once you’ve identified what works and what doesn’t work on social, it’s time to adjust your strategy. The point of tracking your social media ROI isn’t just to prove your social campaigns are valuable, it’s to increase their value over time.

Due to the short lifecycle of social media campaigns, a failing campaign should be changed and improved as soon as possible. Social media is never static. To meet your social media ROI goals, you’ll need to constantly update and adapt your strategy taking into account the analytics data you’re tracking. To read the full article visit:


The tools of Predictive Analytics to improve your CRM

While CRM applications already gather terabytes of helpful customer data for organizations, significantly deeper insights are also on the way thanks to a developing trend of predictive analytics capabilities being integrated into CRM. The big draw is that organizations will be able to use existing CRM data to vastly enhance basic one-on-one interactions with customers. Another key benefit is that it will help organizations generate additional sales when customers contact them, by analyzing incoming customer data in real time.

It's the same idea with CRM that incorporates add-on or built-in predictive analytics: when a potential customer arrives at your company's website to make a purchase, if that customer is offered this item at this price at this moment, are they likely to buy it? One can make a targeted offer to a customer based on what they are looking for. The probability that they accept that offer will determine whether you can maximize customer retention, sales and profits.
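That "will they accept this offer?" question is typically answered with a probability score. Here is a minimal sketch using a logistic function; the features and hand-picked weights are purely illustrative, where in practice they would be learned from historical CRM data.

```python
# Sketch: scoring the likelihood that a customer accepts a targeted offer.
# Weights and features are made up; real systems learn them from CRM history.
from math import exp

WEIGHTS = {"viewed_product_before": 1.2, "discount_pct": 0.08, "cart_abandoner": 0.9}
BIAS = -2.5

def acceptance_probability(features: dict) -> float:
    """Weighted sum of features squashed to a probability in (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + exp(-z))  # logistic function

p = acceptance_probability(
    {"viewed_product_before": 1, "discount_pct": 15, "cart_abandoner": 1}
)
print(round(p, 2))  # → 0.69
```

The business logic then sits on top of the score: show the offer only when the predicted probability clears a threshold that makes the discount worth giving.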

As these predictive analytics features are introduced, organizations will need to evaluate their approaches to integrating the right components into their own infrastructures. That will take research, detailed inquiries and discussions with teams from marketing, IT and other departments, as well as market research and more. It's not something one can jump into with little thought. You should know your objectives before you take the first steps, so you can attain a sufficient payback on your investments of time and resources. To read the full article visit:



How does the product recommendation feature work?

Most shopping websites, regardless of whether they are auction-based like eBay or one sprawling marketplace like Amazon, tend to prominently feature a list of recommended products on their homepages. These lists are the output of a product recommendation engine.

These engines work by taking your preferences into account and correlating them with the products and services available on the site. Needless to say, product recommendation engines naturally enjoy access to the entire product and service database of the website. Information is filtered with your preferences in mind and, afterwards, the recommendation engine comes up with a list of products and services that it considers likely to appeal to you.
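One classic way to do this correlation is collaborative filtering: find the user most similar to you, then suggest items they liked that you haven't seen. A minimal user-based sketch, with made-up ratings data:

```python
# Sketch of user-based collaborative filtering with cosine similarity.
# The ratings below are illustrative, not real data.
from math import sqrt

ratings = {
    "alice": {"camera": 5, "phone": 3, "laptop": 4},
    "bob":   {"camera": 5, "phone": 2, "laptop": 4, "tablet": 5},
    "carol": {"camera": 1, "phone": 5, "headphones": 4},
}

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity over the items both users have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    na = sqrt(sum(a[i] ** 2 for i in shared))
    nb = sqrt(sum(b[i] ** 2 for i in shared))
    return dot / (na * nb)

def recommend(user: str) -> list:
    others = [(similarity(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, nearest = max(others)  # most similar other user
    # Suggest items the nearest neighbour rated that the user hasn't seen.
    return [i for i in ratings[nearest] if i not in ratings[user]]

print(recommend("alice"))  # → ['tablet']
```

Here alice's ratings line up closely with bob's, so bob's tablet is recommended to her; production engines do the same thing over millions of users and blend in item descriptions and browsing history, as described below.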

Predictions made by product recommendation engines are not only based on the descriptions of products and services but also on whatever information the engine can obtain from your social environment and previous web history. It first gains access to a pool of users and collects data based on their online behavior, activities and preferences. All the information collected is then filtered and fed into a platform that categorizes products a group of users may like or dislike. When you visit the site, the first thing it will do is determine which group of users you belong to. From there, it will provide recommendations on the assumption that your tastes are similar to those of users it has studied in the past. To read more:


Predictive Analytics a boon to the financial market

Risk analytics is increasingly important for banks as they cope with a complex regulatory and competitive environment. Powerful technologies and calculation engines are now available that are critical to the future of banks and the entire industry. At the same time, it is possible to develop an over-reliance on analytics, so a balance needs to be found.

Developing more comprehensive and integrated capabilities is increasingly important. Integrated stress-testing, for example, is an important means by which the science of risk management can be turned into more of an art, such that it can be communicated and appreciated by a wider audience. An effective stress-testing framework encompasses a wider spectrum of macro-economic, social, political and environmental considerations and forecasts and so can help banks avoid the tunnel vision that can prevent them from making good decisions and taking timely action.

Companies are investing in risk analytics and intend to increase those investments, yet the potential return is often stifled by inconsistent or incomplete data. This prevents organizations from generating the insights needed to support a more predictive approach to risk management. To read more:


Impact of the fusion of Business and Market Intelligence

Business intelligence (BI) is the broader term describing the collection and analysis of an organization's own data, including sales data and legacy documents. It addresses the question of whether an organization has all the resources and techniques needed to operate effectively in a particular market. This intelligence is typically used to control costs, understand operations and performance, and improve profitability and effectiveness.

Market intelligence (MI) serves to improve decision-making. The distinction is that BI focuses on an organization's own internal data, while MI focuses on external data. MI gives you a clear picture of market opportunities, threats, customer requirements, and the competitive landscape. Analyzing this data helps you decide how to grow the business, gain market share, launch new products, or enter new markets.

A blend of BI and MI can reveal whether your internal resources are optimally aligned with external market potential. Using both can yield powerful insights, but the challenge is that business intelligence and market intelligence data come in different structures and formats. This affects how the available data is accessed, combined, analyzed, and used.
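As a toy illustration of that format problem, the Python sketch below combines an internal BI sales table with an external MI market estimate that arrives in a different shape and unit. All field names and figures are invented for illustration:

```python
# BI source: internal per-region revenue records, in USD.
internal_sales = [
    {"region": "North", "revenue_usd": 120_000},
    {"region": "South", "revenue_usd": 80_000},
]

# MI source: analyst estimates of total market size, keyed in lowercase
# and expressed in thousands of USD (a different structure and unit).
market_size_kusd = {"north": 900, "south": 250}

def market_share(sales, market):
    """Normalize the join key and units, then compare internal revenue
    (BI) against external market potential (MI) as a share per region."""
    shares = {}
    for row in sales:
        key = row["region"].lower()           # reconcile the key format
        total_usd = market.get(key, 0) * 1000  # reconcile the unit
        if total_usd:
            shares[row["region"]] = row["revenue_usd"] / total_usd
    return shares

print(market_share(internal_sales, market_size_kusd))
# North: 120000/900000 ≈ 0.133 of the market; South: 80000/250000 = 0.32
```

Most of the work is not the arithmetic but reconciling keys, units, and shapes, which is exactly the integration challenge described above.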

All of this mirrors the challenges of Big Data: data sets are frequently too large and complex to manipulate or explore with standard methods or tools. To read more:



GIS Technology to tackle the PDS system loopholes

A Geographic Information System (GIS) integrates hardware, software, and data for capturing, managing, analyzing, and displaying all forms of geographically referenced information. GIS allows us to view, understand, question, interpret, and visualize data in ways that reveal relationships, patterns, and trends in the form of maps, globes, reports, and charts. It helps you answer questions and solve problems by presenting your data in a way that is quickly understood and easily shared. GIS technology offers many advantages: it simplifies data handling, covers large areas, supports continuous monitoring through repetitive coverage, is fast, works in inaccessible areas, and is unbiased, accurate, reliable, and economical. Data can be collected in several spectral bands, enabling micro-level analysis. GIS can support decision-making by government officials and increase transparency and accountability for good governance. It can help in managing natural resources, improving the allocation of resources and planning, improving communication during crises, and saving costs through better decision-making.

The effective use and implementation of Radio-frequency identification (RFID), GPS, and data mining techniques in the Public Distribution System (PDS) can streamline the PDS supply chain and promises to curb mismanagement, corruption, trafficking, theft, and anti-social elements. RFID provides highly accurate and detailed information by automatically capturing data at each stage of the supply chain. It also improves the safety and efficiency of the food supply chain. GPS location technology can be combined with RFID to automatically track and record where the produce was picked, when and where it was transported, and its current location. This also helps in reducing theft and trafficking. Data mining techniques based on a rule-based classification model are used to identify suspicious movement behavior of tracked objects. To read more:
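The rule-based classification mentioned above can be sketched in a simple form: flag a delivery vehicle's GPS trace as suspicious when it strays too far from its sanctioned route. The waypoints, coordinates, and threshold below are invented purely for illustration:

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def is_suspicious(trace, route, max_deviation_km=5.0):
    """Rule: a trace is suspicious if any GPS fix lies farther than the
    threshold from every waypoint on the sanctioned route."""
    return any(
        all(haversine_km(fix, wp) > max_deviation_km for wp in route)
        for fix in trace
    )

route = [(28.61, 77.21), (28.70, 77.10)]      # depot -> fair-price shop
ok_trace = [(28.62, 77.20), (28.68, 77.12)]   # stays near the route
bad_trace = [(28.62, 77.20), (28.95, 77.60)]  # long unexplained detour

print(is_suspicious(ok_trace, route), is_suspicious(bad_trace, route))
```

A production system would add more rules (dwell times, off-hours movement, weight discrepancies from RFID reads) and learn thresholds from data rather than fixing them by hand.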

