Visual Analysis as the Solution to the Big Data Problem

Document Type: Research Paper

Subject Area: Computer Science

Document 1

These mechanisms are used to discriminate among data, separating necessary information from unnecessary information, and to retrieve what is important while the organization discards the rest as useless. Visualization therefore operates as a semi-automated analysis technique: because human input is required, the user influences how the analysis is carried out. Visual analysis integrates human direction with the work of the computer, so the system analyzes information in a manner that reflects the interests of the person issuing the commands. In addition, the computer allows the individual to concentrate on the task while it handles the visual representation, sketching, and presentation of the data in a form the researcher or analyst can use.

As a consequence, many institutions face a backlog of data awaiting analysis. Perhaps the major source of the big data problem is the continuous increase in device storage capacity and the enhancement of cloud storage techniques. Previous research has proposed mechanisms through which the problem of volume may be handled (Heer & Shneiderman, 2012). The problem is best addressed through various approaches, including increased processing power, artificial intelligence, data reduction, and data mining, among other effective methods of handling large volumes of digital data (Keim et al.). With too much data to analyze, the problem causes delays in forensic data analysis. In the present case, a hypothesis would be generated concerning the amount of time needed to do something; that time frame can then be ascertained, proved, or disproved through visual analysis.
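To illustrate the data-reduction approach mentioned above, the sketch below aggregates a large event log into hourly counts before visualization, shrinking thousands of individual records into a handful of buckets. The timestamps and spacing are invented for the example; no real tool or dataset is implied.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical raw event log: one timestamp per record (invented data).
start = datetime(2023, 5, 1, 0, 0)
events = [start + timedelta(minutes=7 * i) for i in range(1000)]

# Data reduction: aggregate individual records into hourly counts,
# so a plot needs only one bar per hour instead of one mark per record.
hourly_counts = Counter(
    e.replace(minute=0, second=0, microsecond=0) for e in events
)

print(len(events))         # 1000 raw records
print(len(hourly_counts))  # 117 hourly buckets after reduction
```

Reducing the data this way keeps the temporal pattern visible while cutting the volume the visualization layer must handle.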

In that regard, visual analysis would be used to determine the role of time in that process, thereby confirming or contradicting the hypothesis. Where a probability must be determined, time analysis is used to characterize that probability with regard to temporal factors. Temporal factors influencing whether something happened are among the determining factors in decision making when the subject under discussion concerns time. Time analysis is therefore applied to confirm or refute a probability hypothesis. The implication is that the data analyst discriminates the information by isolating it into ranges (Keim et al.). Filtering enables a data analyst to define a range, estimate the probability that what is being looked for falls within that range, and draw conclusions from the small section extracted from the large pool of data.
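The range-based filtering described above can be sketched in a few lines: isolate the records whose timestamps fall inside a time frame of interest, then estimate the probability that a logged event lies in that range. The log entries and the working-hours window below are invented for illustration.

```python
from datetime import datetime

# Hypothetical forensic log entries: (timestamp, event) pairs; data invented.
log = [
    (datetime(2023, 5, 1, 9, 15), "login"),
    (datetime(2023, 5, 1, 11, 40), "file_copy"),
    (datetime(2023, 5, 1, 13, 5), "login"),
    (datetime(2023, 5, 1, 22, 30), "file_copy"),
]

# Define the temporal range of interest (working hours on 1 May 2023).
lo, hi = datetime(2023, 5, 1, 9, 0), datetime(2023, 5, 1, 17, 0)
in_range = [entry for entry in log if lo <= entry[0] < hi]

# Estimate the probability that a logged event falls inside the range.
p = len(in_range) / len(log)
print(p)  # 0.75
```

A hypothesis such as "most activity occurred during working hours" is then confirmed or refuted by comparing this proportion against the expected value, with the filtered subset available for closer visual inspection.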

With filtering, a probability hypothesis can easily be confirmed or refuted (Shen & Eliassi-Rad, 2006). If the required parameter is temporal, data would be isolated by indicating a time frame. In the same manner, data can be isolated through the connections they have to a particular company or entity (Steed et al., 2013).

Evaluate Visual Analysis Tools

There are many visual analysis tools in the market, and some are tested and proven to deliver the desired results in data analysis. After data has been presented in a spreadsheet, the next step is to apply visual analysis tools to visualize the data and make inferences (Shen & Eliassi-Rad, 2006). The list of visual analysis tools in the market is long; it includes Maltego, Tableau, Infogram, ChartBlocks, Datawrapper, Plotly, RAW, visual.ly, and D3.

In addition, the tool generates a server report which can be viewed online or through the app. For companies that prefer a cloud-based solution to their big data problem, Tableau offers a cloud-based service for data analysis besides the manual setup.

Infogram

Infogram is a big data visualization tool that assists analysts in linking their infographics to a company's or an entity's big data (Shen & Eliassi-Rad, 2006). The tool uses a three-step method to perform the task and permits the use of templates in the platform (Heer & Shneiderman, 2012). For purposes of cost-benefit analysis, it is appropriate to weigh the price of the recommended tools against their benefits (Heer & Shneiderman, 2012). The commercial price of Infogram is $19 per month, per user.
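Earlier in this section, isolating data through their connections to a particular company or entity was described as the non-temporal counterpart of time-frame filtering. A minimal sketch of that idea follows; the record fields, entity names, and amounts are all invented for illustration.

```python
# Hypothetical records linking transactions to entities; data invented.
records = [
    {"entity": "Acme Corp", "amount": 1200},
    {"entity": "Globex", "amount": 500},
    {"entity": "Acme Corp", "amount": 300},
]

# Isolate only the records connected to the entity under investigation.
acme = [r for r in records if r["entity"] == "Acme Corp"]
total = sum(r["amount"] for r in acme)

print(len(acme), total)  # 2 1500
```

The small subset produced by such a filter is what the analyst then hands to a visualization tool, rather than the full pool of data.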
