Times change: while our grandparents probably gazed up at a starry sky and wondered how far the solar system extended, today's digital natives ask the same question about another universe, undoubtedly much younger but, with due proportion, just as vast: the Internet. Answering the question is certainly not simple. There are several ways to approach it, but none can offer an exact answer; the best one can do is proceed by estimates and approximations, starting from essentially objective data. This is the approach summarized in an interesting article published by Live Science, which tries to answer the question: how big is the Internet, really?
A first answer draws on Internet Live Stats, which provides interesting statistics on the phenomena that affect the network and, as such, gives an idea of how vast it may be: every second, roughly 6,000 tweets are sent, approximately 40,000 Google searches are performed, and over 2 million emails are dispatched. These are already impressive numbers which, however, can only provide a partial picture of the Internet.
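To give those per-second rates a little more weight, here is a minimal back-of-the-envelope sketch in Python. The rates are the ones quoted above; the extrapolation to a full day is an assumption for illustration, since real traffic is not constant around the clock.

```python
# Back-of-the-envelope extrapolation of the per-second rates quoted above.
# Assumes the rates hold constant around the clock, which real traffic does not.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

rates_per_second = {
    "tweets": 6_000,
    "Google searches": 40_000,
    "emails": 2_000_000,
}

for name, per_second in rates_per_second.items():
    per_day = per_second * SECONDS_PER_DAY
    print(f"{name}: ~{per_day:,} per day")

# tweets: ~518,400,000 per day
# Google searches: ~3,456,000,000 per day
# emails: ~172,800,000,000 per day
```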
Another useful parameter for grasping the contours of the Internet universe is the count of the nodes that make it up: in 2014 the number of websites reached 1 billion, a figure in constant oscillation given the succession of "births" and "deaths", a little like the stars mentioned at the opening. To continue the parallel with the real universe, the virtual universe of the Internet also has its "black holes" that make its extent hard to measure: the so-called Deep Web, the dark area of the Internet not indexed by Google and the other search engines, conceals a part of the network whose boundaries and content cannot always be identified.
The analysts' judgment, not very technical but very pragmatic, is that the web "is large and keeps growing larger." To give more substance to the word "large", it may be useful to consult WorldWideWebSize.com, a site that does not mince words and immediately declares its purpose: to measure the size of the World Wide Web, using the number of indexed web pages as its yardstick (therefore excluding the unknowable pages locked away in the Deep Web). The numbers, as is easy to understand, change constantly, but those shown today (March 22) amount to no fewer than 4.64 billion pages.
Live Science continues its discussion by offering another way to estimate the extent of the Internet, once again changing the observation point and adopting as a parameter the amount of information traveling over the network. On this point the source cites Martin Hilbert, communications professor at the USC (University of Southern California). To understand the importance of this parameter, it helps to look at Hilbert's definition of what the network does: "The Internet stores information, communicates information and processes information." The network's communication capacity can therefore be derived from the quantity of information that the Internet can transfer, and actually transfers, at any given time.
The Internet's storage capacity was estimated in 2014, in a paper published in the journal Supercomputing Frontiers and Innovations, at 10^24 bytes, or, to put it differently, 1 million exabytes. For a benchmark, the source recalls, a byte is a unit of data comprising 8 bits and is equivalent to a single character in a text, while an exabyte corresponds to 1 billion billion bytes.
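As a sanity check on that conversion, a minimal sketch, assuming decimal (SI) prefixes, i.e. 1 exabyte = 10^18 bytes:

```python
# Unit check for the figure above, using decimal (SI) prefixes.
BYTES_PER_EXABYTE = 10**18  # 1 EB = 1 billion billion bytes

internet_storage_bytes = 10**24  # the 2014 estimate cited above
print(internet_storage_bytes // BYTES_PER_EXABYTE)  # 1000000 -> 1 million EB
```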
As mentioned, to measure the network's communication capacity, and thereby perceive its extent, it can be useful to identify the amount of information that actually circulates on it. In this regard, according to the analysis carried out by Cisco with its Visual Networking Index initiative, this parameter must nowadays be expressed in zettabytes. According to Cisco's estimates, by the end of 2016 global Internet traffic will reach the threshold of 1.1 zettabytes and, by 2019, the figure will cross the finish line of 2 zettabytes per year.
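Those two forecasts imply a steep growth rate. Purely as an illustration derived from the figures above, the compound annual growth rate they imply works out as follows:

```python
# Implied compound annual growth rate between Cisco's two forecasts above.
traffic_2016 = 1.1  # zettabytes/year, end of 2016
traffic_2019 = 2.0  # zettabytes/year, forecast for 2019
years = 3

cagr = (traffic_2019 / traffic_2016) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # ~22.1%
```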
Remember that one zettabyte equals 1 sextillion bytes, or 1,000 exabytes. To give substance to the "cold numbers", it may help to point out that one zettabyte equals the amount of data consumed by watching high-definition video for 36,000 years, or the amount needed to watch the entire Netflix catalog 3,177 times (figures supplied by Cisco in 2011). For those who love numbers, note also that Professor Hilbert and his colleagues, in a 2011 paper in the journal Science, estimated the Internet's communication capacity at 3 x 10^12 kilobits per second (maximum bandwidth, not the actual amount of data exchanged).
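To relate Hilbert's bandwidth figure to the zettabyte scale, a small conversion sketch (decimal units assumed; remember this is theoretical maximum capacity, not actual traffic):

```python
# Converting Hilbert's 2011 estimate of total communication capacity
# (3 x 10**12 kilobits per second) to bytes/s and zettabytes per year.
# Decimal (SI) units assumed; maximum bandwidth, not actual traffic.
kilobits_per_second = 3e12
bits_per_second = kilobits_per_second * 1_000
bytes_per_second = bits_per_second / 8           # 3.75e14 bytes/s

SECONDS_PER_YEAR = 365 * 24 * 60 * 60            # ~3.15e7 s
BYTES_PER_ZETTABYTE = 10**21

zb_per_year = bytes_per_second * SECONDS_PER_YEAR / BYTES_PER_ZETTABYTE
print(f"~{zb_per_year:.1f} ZB/year of theoretical capacity")  # ~11.8 ZB/year
```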
Attempts to measure the extent of the Internet in even more readily perceptible units have not stopped in more recent times. As Live Science recalls, in 2015 there were attempts to express the size of the web in physical terms, using rather picturesque but evocative examples: according to the estimates of one group of scholars, for instance, 2% of the Amazon rainforest would have to be cut down to produce the paper needed to print the contents of the entire web, Deep Web included (a study published in JIST). The result was reached by assuming an average of 30 A4 pages per web page, for a total of 1.36 x 10^11 sheets for a copy of the entire web.
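The study's page arithmetic can be worked backwards from the figures quoted above; note that the web-page count this yields is an inference for illustration, not a number stated in the article:

```python
# The study's page count, worked backwards from the figures quoted above.
A4_SHEETS_PER_WEB_PAGE = 30
total_a4_sheets = 1.36e11

implied_web_pages = total_a4_sheets / A4_SHEETS_PER_WEB_PAGE
print(f"~{implied_web_pages:.2e} web pages")
# ~4.53e+09, close to the ~4.6 billion indexed pages
# reported by WorldWideWebSize.com earlier in the article
```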
Even in this case, however, we proceed by approximation, since only a part, albeit an important one, of the information stored on the web is represented by the text contained in its billions of sites. According to data provided by Cisco researchers, in 2015 some 8,000 petabytes of IP traffic per month consisted of video, compared with about 3,000 petabytes used for email, web browsing and data transfer.
Understanding how extensive the web is today is closely linked to the trends characterizing its evolution. Hilbert points out in this regard that the Internet is growing by leaps and bounds, and only one embankment can contain this veritable avalanche of information: computing capacity, which grows at a faster rate than storage capacity does. Roughly speaking, while the world's storage capacity doubles every three years, its computing capacity doubles every year and a half. This means that the Internet universe is ever more extensive, but also ever less chaotic: think of the algorithms that already filter the information circulating on the network and that, in the future, will make ever more extensive use of artificial intelligence techniques.
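A toy illustration of those two doubling rates: the three-year and eighteen-month periods are the only inputs taken from the article, and normalizing both capacities to 1.0 at year zero is an assumption made purely for the comparison.

```python
# Toy model of the two doubling rates cited above: storage capacity
# doubling every 3 years vs. computing capacity every 1.5 years.
# Both normalized to 1.0 at year 0 (an assumption for illustration).
STORAGE_DOUBLING_YEARS = 3.0
COMPUTE_DOUBLING_YEARS = 1.5

for year in (0, 3, 6, 9):
    storage = 2 ** (year / STORAGE_DOUBLING_YEARS)
    compute = 2 ** (year / COMPUTE_DOUBLING_YEARS)
    print(f"year {year}: storage x{storage:.0f}, compute x{compute:.0f}, "
          f"compute/storage x{compute / storage:.0f}")

# The ratio itself doubles every 3 years, since
# 2**(t/1.5) / 2**(t/3) = 2**(t/3): ever more computing
# power is available per byte stored, hence "less chaotic".
```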