The rising importance of big-data computing stems from advances in many different technologies. Some of these include:
- Computer networks
- Data storage
- Cluster computer systems
- Cloud computing facilities
- Data analysis algorithms
How do these technologies play a role in global computing and big data?
Please make your initial post and two response posts substantive. A substantive post will do at least two of the following:
- Ask an interesting, thoughtful question pertaining to the topic
- Answer a question (in detail) posted by another student or the instructor
- Provide extensive additional information on the topic
- Explain, define, or analyze the topic in detail
- Share an applicable personal experience
- Provide an outside source (for example, an article from the UC Library) that applies to the topic, along with additional information about the topic or the source (please cite properly in APA)
- Make an argument concerning the topic.
Global computing and big data are driving the evolution of the digital society and enhancing the performance of organizations. Both technologies offer numerous benefits: big data allows an organization to uncover insights, hidden patterns, and new market trends, as well as the preferences and choices of its customers, while computing techniques let the organization communicate across the globe using software tools and innovative computing methods. Global computing can be described as massively distributed computation that draws on computing resources that sit unused even while connected to the internet. The primary idea of global computing is to harvest the idle time of the computers that are connected to the internet (Germain & Fedak, 2002).
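The idle-time harvesting idea above can be illustrated with a toy sketch. The class and function names here are invented for illustration; real volunteer-computing systems such as BOINC add work verification, checkpointing, and network communication on top of this basic loop.

```python
# Toy sketch of global computing's idle-time harvesting:
# hand work units only to machines that are currently idle.
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Volunteer:
    name: str
    idle: bool = True
    results: list = field(default_factory=list)

    def run(self, work_unit):
        # The donated "work" here is just squaring a number.
        self.results.append(work_unit ** 2)

def distribute(work_units, volunteers):
    """Hand each pending work unit to the next idle volunteer machine."""
    queue = deque(work_units)
    while queue:
        for v in volunteers:
            if v.idle and queue:
                v.run(queue.popleft())

volunteers = [Volunteer("desk-1"), Volunteer("desk-2", idle=False), Volunteer("desk-3")]
distribute(range(6), volunteers)
for v in volunteers:
    print(v.name, v.results)  # desk-2 donates nothing because it is busy
```

The key design point mirrors Germain and Fedak's idea: the busy machine (`desk-2`) is never interrupted, while the idle machines absorb all of the work between them.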
Big data and global computing play a very important role in many different technologies such as sensors, computer networks, data storage, and cluster computing systems. Together they allow these technologies to gather, identify, store, and process data in order to make it meaningful. Big data needs a very large amount of space to process all of the data generated to date, yet much of this processing can be carried out on ordinary systems, and data analysis algorithms can run while a system is idle. Cloud computing and big data are a perfect combination: big data deals with enormous volumes of data, whereas cloud computing provides the infrastructure. When an organization combines the two, the result is scalable and cost-effective (Verma, 2018).
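The way cluster computing systems make large data meaningful can be sketched in a few lines: split the data into chunks, process each chunk independently (as separate cluster nodes would), and merge the partial results. This is only an illustration; real frameworks such as Hadoop or Spark add scheduling, fault tolerance, and distributed storage on top of this pattern.

```python
# Cluster-style divide-and-aggregate: each worker counts words in its
# own chunk, then the partial counts are reduced into one total.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk: str) -> Counter:
    # Each "node" processes only its own slice of the data.
    return Counter(chunk.split())

def cluster_word_count(chunks, workers: int = 3) -> Counter:
    """Fan the chunks out to workers, then merge the partial counts."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(count_words, chunks)
    total = Counter()
    for partial in partials:
        total += partial
    return total

chunks = ["big data big", "data cloud data", "cloud"]
print(cluster_word_count(chunks))
```

Because each chunk is processed independently, adding more workers (or more machines) scales the computation, which is exactly the property that makes clusters and cloud infrastructure such a good fit for big data.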
Germain, C., & Fedak, G. (2002, January). Global Computing Systems. https://www.researchgate.net/publication/2414045_Global_Computing_Systems
Verma, A. (2018, July 21). Big data and cloud computing – A perfect combination. Whizlabs. https://www.whizlabs.com/blog/big-data-and-cloud-computing/
A few years ago, sensor technology was seen as an extremely promising field, and new sensor product categories seemed to spring up overnight. Sensor technology, however, is only one factor in the overall picture; the other major factor in computing and big data is that big data itself is becoming increasingly capable. Computer networks provide a convenient way to connect many different data sources to one common data store, including computer network logs and other system logs (Deutsch et al., 2020). These logs can be analyzed and correlated with other data in the system to learn more about issues in it. For instance, email is often sent over the network, and data traffic such as file transfers is routed over it; from all of this activity, a big-data picture emerges. As the data held by companies, organizations, and individuals grows, they are also expected to keep a significant amount of it available for later use. Storage is the common medium for retaining this data and often includes non-volatile random-access memory such as flash memory (Deutsch et al., 2020).
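The log-correlation idea above can be made concrete with a small sketch: join network log entries with an application error log on a shared host-and-time key to learn more about an issue. The field names and log formats here are invented for illustration.

```python
# Hypothetical logs: a spike in network bytes and an application error
# on the same host at the same minute can be correlated automatically.
network_log = [
    {"host": "web-1", "minute": "10:02", "bytes": 512},
    {"host": "web-1", "minute": "10:03", "bytes": 98000},
    {"host": "web-2", "minute": "10:03", "bytes": 640},
]
error_log = [
    {"host": "web-1", "minute": "10:03", "error": "timeout"},
]

def correlate(net, errors):
    """Pair each network entry with an error on the same host and minute."""
    index = {(e["host"], e["minute"]): e["error"] for e in errors}
    return [
        {**entry, "error": index[(entry["host"], entry["minute"])]}
        for entry in net
        if (entry["host"], entry["minute"]) in index
    ]

print(correlate(network_log, error_log))
```

Here the correlation surfaces that the `timeout` error coincided with unusually heavy traffic on `web-1`, which is precisely the kind of insight that analyzing logs together, rather than in isolation, can reveal.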
The role of cluster computer systems in global computing and big data is understudied. For years, the computer cluster was regarded as something of a taboo subject in computer science circles, but computer security researchers and some academic practitioners have now come to terms with large-scale, data-rich computing (Krogstie, 2017). At the very core of any information system is a cluster computer, and because clusters are so pervasive, it is important to understand how and why they are built. Cloud computing facilities can significantly reduce the time required to perform many traditional processing workloads in a relational database. With these cloud facilities, IT practitioners can use a single, central computing environment to provide multiple, standalone, and consistent points of access to highly complex and dynamic data at a lower cost per unit. Finally, the role of data analysis in these areas deserves more than a little explanation: data analysis is a technical discipline that is not tied to any particular set of methods or technologies (Krogstie, 2017).
Deutsch, E. W., Bandeira, N., Sharma, V., Perez-Riverol, Y., Carver, J. J., Kundu, D. J., … & Wertz, J. (2020). The ProteomeXchange consortium in 2020: Enabling 'big data' approaches in proteomics. Nucleic Acids Research, 48(D1), D1145–D1152.
Krogstie, J. (2017). The core enabling technologies of big data analytics and context-aware computing for smart sustainable cities: A review and synthesis.