Thirty years have passed since the opening of the Internet to private entities and commercial use in 1991. Over these decades, the Internet has become a central element in the life of a growing share of the world population, weaving itself ever more deeply into the fabric of our lives (Castells, 2001, 1).
How do we use the Internet? How often? For how long? For what purposes? With what consequences? Many researchers have been reflecting and working on these questions, and entire fields of research have emerged (and will probably continue to emerge) around them.
Our knowledge about online behavior in Internet research is, however, still predominantly based on self-reporting methods that involve posing questions and analyzing responses, such as surveys, individual interviews, focus groups or diary studies. Examples of Internet behavior analyses include the use of surveys and individual interviews to assess respondents’ access to the Internet, including Internet use per week in minutes (as in Reich & Vorderer, 2013), among many others (Vorderer et al., 2016; Blank & Lutz, 2017; Reinecke et al., 2018; Pew Research Center, 2018; Banaji et al., 2018; Trindade & Duarte, 2019; Hargittai & Shaw, 2020; Smahel et al., 2020; Hunsaker et al., 2020; Louro et al., 2020; Latzer et al., 2020, to mention some of the most recent and relevant authors).
The use of self-reporting data collection methods raises, however, several methodological issues. These issues have been known and systematically explored at least since the 1930s (LaPiere, 1934) and relate, on the one hand, to the limitations in memorizing and precisely recalling our own behavior and, on the other hand, to the complex set of relations we establish between representations and effective behavior (Foddy, 1993). Our knowledge of Internet use and online behavior therefore risks being biased by misreporting and closer to the representations we develop than to our actual behavior (Revilla et al., 2016; Guess et al., 2019).
The existing research on Internet use and online behavior based on monitoring and observation methods is, on the other hand, clearly influenced by authors coming from industry, frequently from technology or audience measurement companies. This is the case of Bilenko & White (2008), from Microsoft Research, Adar, Teevan & Dumais (2009), from the University of Washington and Microsoft Research, Kumar & Tomkins (2010), from Yahoo! Research and Google, or Revilla, Ochoa & Loewe (2016), from Pompeu Fabra University and Netquest.
In this context, the Living Lab on Media Content and Platforms (LLMCP) has developed a proprietary online panel of Internet users, under the project LLMCP LisPan. Through this panel, the research team had access to information about the online behavior of panel participants, collected by a web application that stores information from the web browser’s History on the participant’s laptop or desktop computer in a MySQL database, available for subsequent analysis.
The consortium for the project LLMCP LisPan was led by ESCS (School of Communication and Media Studies) and also included the University of Aveiro, the Polytechnic Institute of Leiria, the Polytechnic Institute of Santarém and Innovation Makers, a Portuguese information technology company.
The project received funding from Fundação para a Ciência e Tecnologia (FCT – the Portuguese funding agency for science, research and technology) (Grant LISBOA-01-0145-FEDER-023937) and was active from March 2018 to December 2019.
The study aimed to contribute to a further understanding of online behavior by exploring Internet use among college students in Portugal, and the following research questions were considered.
RQ1. How is Internet use distributed throughout the days of the week?
The analysis of this question was carried out using two variables: (i) the average number of users tracked on each day of the week, over six months, and (ii) the average number of web navigation actions performed by each user effectively tracked, on each day of the week.
Since the Google Chrome extension used in the panel only records information when the user is performing a navigation action, users who do not use the web browser on a given day are not recorded with zero navigation actions for that day – they are simply missing cases for that specific day. This is why the variation in the number of tracked users is relevant: a higher or lower average number of tracked users on some days of the week can correspond to an effectively higher or lower level of Internet use (or Internet audience) on those days.
Considering the users that are active online and tracked by the web application on a given day, a higher or lower average number of web navigation actions during that day can correspond to an effectively higher or lower level of Internet use on that day.
RQ2. How is daily Internet use distributed per hour?
Similarly to the previous research question, the analysis was based on two variables: (i) the average number of users tracked in each hourly time period, each day, over the 182 days analyzed, and (ii) the average number of web navigation actions performed by each user effectively tracked in each of these time periods.
RQ3. Are the daily and hourly Internet use patterns of the week affected by the difference between the Permanently Online and the Permanently Connected dimensions?
Are the patterns identified through RQ1 and RQ2 altered when we differentiate between Internet use in the domain of the online subdimension (search and use of online content) and in the domain of the connected subdimension (online interaction with other users) (Vorderer et al., 2018; Vorderer et al., 2016)?
RQ4. Are the daily and hourly Internet use patterns of the week affected by academic periods?
The analyzed period, comprising 26 weeks, includes weeks with classes and weeks with no classes, due to school breaks, exams and holidays. Is the pattern of Internet use somehow different according to these periods?
RQ5. Can we find differences between representations and actions?
An online survey was conducted regarding the representations of Internet use by the panel members. Is it possible to find differences between representations and actions by comparing the results of this survey with data collected through the web application for the same individuals?
Data was collected through a proprietary online panel of Internet users, developed by the research team. At the end of 2018, the team initiated the development of a web application, implemented as an extension for Google Chrome, the most popular browser among Portuguese Internet users (Statcounter, 2018), capable of collecting data on browsing behavior in real time (Montargil, Di Fátima, Rodrigues, & Santos, 2019; Montargil, Di Fátima & Ruiz, 2020).
The process to become a panel member is organized in three stages. First, the Internet user is asked to fill in and submit an online form. Second, the research team analyses the profile of the Internet user, based on the data provided, and approves the new panel member. Finally, the user receives an email with instructions to download, install and activate the extension in their own browser.
Once the Google Chrome extension is installed and activated, it accesses the browser’s History and collects information regarding online behavior, the most relevant variables being the visited webpage URL address, HTML title and access date and time. Due to an initially unexpected technical limitation, related to a bug reported and acknowledged by Google, information is only tracked through the laptop and desktop computers of panel members. Since each newly accessed URL address is recorded in the browser’s History (with the exception of incognito windows), the extension identifies every new visited address and generates a corresponding record in the panel database (MySQL). This process runs continuously from the moment the extension is activated, unless the user uninstalls the extension or suspends its activity (an option allows the user to easily and quickly suspend the extension’s activity).
This process therefore collects detailed information regarding web navigation actions, allowing a comprehensive analysis of Internet behavior on laptop and desktop computers.
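For illustration purposes only, the sketch below shows the kind of record that could be generated for each navigation action; the field names are hypothetical and do not necessarily correspond to the actual schema of the panel database.

```python
# Illustrative sketch only: a possible structure for a navigation action record.
# Field names are hypothetical, not the project's actual MySQL schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NavigationAction:
    user_id: int          # anonymized panel member identifier
    url: str              # visited webpage URL address
    title: str            # HTML title of the visited page
    visited_at: datetime  # access date and time

record = NavigationAction(
    user_id=42,
    url="https://www.example.org/article",
    title="Example article",
    visited_at=datetime(2019, 3, 4, 15, 27),
)
```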
Additionally, an online survey was conducted between 7 and 15 July 2019, using Google Forms, regarding the representations of Internet use by the panel members.
The sample consisted mostly of students at ESCS (School of Communication and Media Studies) and, by December 2019, the panel had gathered a total convenience sample of around 130 registered users.
This analysis, however, only considers the registered users (i) who are students at the school and (ii) with data collected over a period of six months (26 weeks), between January 20 and July 20, 2019, corresponding to a total of 182 days (of which 111 fall in periods with classes being held and 71 in school breaks).
A total of 70 students were tracked at least once (at least one navigation action) in this period. These users are graduate and undergraduate students in the areas covered by the school: journalism, corporate communication, public relations, audiovisual, multimedia, advertising and marketing.
The sample consists of 80% female and 20% male students, with an average age of 22.5 years (SD = 3.7, Min = 19, Max = 34), 14% with working-student status, distributed across the several degree programs and closely reflecting the school’s student profile.
The sample for the online survey completed in July 2019 consisted of 40 respondents (not all of whom were regularly monitored through the Google Chrome extension).
Data analysis is based on the concept of navigation actions. A navigation action corresponds to a record stored in the MySQL panel database, fetched by the Google Chrome extension from the browser’s History.
Two types of navigation actions are collected by the extension: web navigation actions (WNA), when content usually accessed through the HTTP (http://...) or HTTPS (https://...) protocols is consulted, and, in some cases, local navigation actions (LNA), when content stored locally on the computer is accessed (when opening downloaded PDF or image files, or consulting the browser’s settings, for instance). The information on individual navigation actions for this sample in the selected time period was stored in a main dataset, specific to this analysis, and only online browsing (WNA) was considered.
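The distinction between WNA and LNA can be illustrated by a simple rule based on the URL scheme, as in the following sketch (a simplified approximation, not the extension’s actual code):

```python
# Simplified illustration of the WNA/LNA distinction, based on the URL scheme.
from urllib.parse import urlparse

def action_type(url: str) -> str:
    scheme = urlparse(url).scheme.lower()
    if scheme in ("http", "https"):
        return "WNA"  # web navigation action (online content)
    return "LNA"      # local navigation action (file://, chrome://, etc.)

print(action_type("https://www.escs.ipl.pt/"))         # WNA
print(action_type("file:///C:/Users/ana/report.pdf"))  # LNA
print(action_type("chrome://settings/"))               # LNA
```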
WNA were analyzed by time period (day and hour of the day), and the number of web navigation actions per user was calculated for each day (182 days) and each hour of each day (4,368 hourly time periods) and stored in two secondary datasets (one with data for each of the 182 days, the other with data for each of the 4,368 hourly time periods). This information was subsequently used to calculate (i) the number of users tracked in each specific period and (ii) the average web navigation actions per user (AWNAPU) for each day of the week and for each hour of the day, thus allowing us to explore research questions (RQ) 1 and 2.
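The following sketch illustrates, with toy data and hypothetical column names, how the number of tracked users and the AWNAPU could be derived per day and per hourly time period from the individual WNA records; it approximates the procedure described above, not the project’s actual scripts.

```python
# Illustrative aggregation of WNA records into daily and hourly measures.
import pandas as pd

# One row per web navigation action (toy data; hypothetical column names).
wna = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2, 3],
    "visited_at": pd.to_datetime([
        "2019-03-04 09:10", "2019-03-04 09:12", "2019-03-04 21:30",
        "2019-03-05 10:05", "2019-03-05 10:07", "2019-03-05 14:40",
    ]),
})
wna["date"] = wna["visited_at"].dt.date
wna["hour"] = wna["visited_at"].dt.hour

# Users with no actions on a given day have no rows for that day:
# they are missing cases, not zeros.
daily = wna.groupby("date").agg(users_tracked=("user_id", "nunique"),
                                actions=("user_id", "size"))
daily["awnapu"] = daily["actions"] / daily["users_tracked"]

hourly = wna.groupby(["date", "hour"]).agg(users_tracked=("user_id", "nunique"),
                                           actions=("user_id", "size"))
hourly["awnapu"] = hourly["actions"] / hourly["users_tracked"]
print(daily)
```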
A variable was created in the main dataset, according to the PO/PC approach (Vorderer et al., 2018; Vorderer et al., 2016), classifying each WNA as corresponding to a “permanently online” behavior (access and use of online content) or to a “permanently connected” behavior (online social interaction). Essentially, web navigation actions where the user can be considered “connected to others” were classified in the “connected” category, including contact with other people through e-mail, Instagram, Facebook, YouTube, Twitter, LinkedIn, Reddit, Pinterest or other social network sites (SNS) (Vorderer et al., 2016, 701). Due to the large number of WNA collected, this classification was performed automatically, with a VBA macro classifying each WNA according to the visited address. Although further research is required to explore in more detail the differences between “connected” (which could probably also be considered “active”) and “online” (or “passive”) use of SNS, this procedure provides a first approach, with relevant information on these dimensions of online behavior. The results were recorded in the secondary datasets (number of actions per type: online vs. connected) and this variable was used to explore RQ3.
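The classification itself was performed with a VBA macro; the following Python sketch merely illustrates the underlying rule, using an assumed and deliberately non-exhaustive list of domains for the “connected” category.

```python
# Illustrative rule for the PO/PC classification of a WNA by visited address.
# The domain list is an assumption for demonstration, not the project's actual list.
from urllib.parse import urlparse

CONNECTED_DOMAINS = (
    "facebook.com", "instagram.com", "youtube.com", "twitter.com",
    "linkedin.com", "reddit.com", "pinterest.com", "mail.google.com",
)

def po_pc_dimension(url: str) -> str:
    host = urlparse(url).netloc.lower()
    if any(host == d or host.endswith("." + d) for d in CONNECTED_DOMAINS):
        return "connected"  # online interaction with other users
    return "online"         # search and use of online content

print(po_pc_dimension("https://www.instagram.com/p/xyz/"))       # connected
print(po_pc_dimension("https://en.wikipedia.org/wiki/Internet"))  # online
```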
Two other variables were created in the secondary datasets in order to explore RQ4. One identifies the academic period (classes, exam periods or holidays) corresponding to each day and the other only distinguishes between periods with classes and periods with no classes.
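As an illustration, such variables could be derived from the date of each record as sketched below; the date ranges shown are placeholders, not the school’s actual academic calendar.

```python
# Illustrative mapping from calendar dates to academic periods.
# The date ranges are placeholders, not the actual academic calendar.
import datetime as dt

CLASS_PERIODS = [(dt.date(2019, 2, 4), dt.date(2019, 5, 31))]  # placeholder
EXAM_PERIODS = [(dt.date(2019, 6, 3), dt.date(2019, 7, 5))]    # placeholder

def academic_period(day: dt.date) -> str:
    if any(start <= day <= end for start, end in CLASS_PERIODS):
        return "classes"
    if any(start <= day <= end for start, end in EXAM_PERIODS):
        return "exams"
    return "holidays"

def has_classes(day: dt.date) -> bool:
    return academic_period(day) == "classes"

print(academic_period(dt.date(2019, 3, 15)))  # classes (with the placeholder calendar)
```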
The analysis of the online survey was completed separately, using specific procedures and software (SPSS), and then compared with the data collected through the Google Chrome extension.
A total of 459,078 navigation actions were performed by the panel members during the 26 weeks analyzed. Of these, 453,170 (98.7%) correspond to WNA and 5,908 (1.3%) to LNA (information accessed by the browser but stored locally). Only WNA were considered in the following analysis.
Of the total of 453,170 web navigation actions, 304,440 (67%) correspond to the search and use of online content (“online” dimension) and 148,730 (33%) to online interaction with other users (“connected” dimension). Roughly two thirds of the registered web navigation actions therefore correspond to the “online” dimension and one third to online interaction with other users (mainly the use of social network sites and e-mail).
Although the use of laptop and desktop devices by the college students enrolled in the online panel reveals extensive Internet use throughout the days of the week and the hours of the day, some relevant differences occur.
Statistical tests support the idea that Internet use is not constant throughout the days of the week. It is lower on Fridays, Saturdays and Sundays, measured either through the number of tracked users or through the average web navigation actions performed by each panel member that actually uses the Internet (AWNAPU) on a given day. Students do not take the weekend off and completely eliminate Internet use on desktop and laptop devices, but there is a statistically significant reduction on Fridays, Saturdays and Sundays.
The distribution of the number of users throughout the hours of the day makes it possible to establish the average resting time between 2.00 and 10.00 a.m. and to distinguish three additional time periods: 1) higher intensity (10.00 a.m.-1.00 p.m., 2.00 p.m.-5.00 p.m. and 9.00 p.m.-10.00 p.m.); 2) medium intensity (5.00 p.m.-8.00 p.m. and 10.00 p.m.-11.00 p.m.); and 3) lower intensity (1.00 p.m.-2.00 p.m., 8.00 p.m.-9.00 p.m. and 11.00 p.m.-2.00 a.m.).
By introducing into the analysis the differentiation between the search and use of online content (the “online” dimension) and online interaction with other users, corresponding essentially to the use of social network sites and e-mail (the “connected” dimension), it is possible to verify that roughly two thirds of web navigation actions correspond to the “online” dimension and one third to the “connected” dimension. This must be seen as quite substantial, especially if we consider that this research only includes desktop and laptop computers, and not mobile devices.
Patterns of distribution of users and of web navigation actions are not substantially altered when the “online” dimension is considered, both across weekdays and across the hours of the day. The “connected” dimension shows, however, relevant changes in the distribution throughout the hours of the day: online interaction tends to spread in a significantly more uniform way throughout the whole day than the search and use of online content. The reduction of Internet use intensity during the night is far less substantial in the connected dimension than in the online dimension, with active users keeping the connected intensity during the night closer to the intensity registered during the day. More than permanently online and permanently connected, users are therefore more available to be constantly (in the sense of less subject to variation) connected than to be constantly online. This is somewhat close to the idea in previous research findings that “(…) the tendency to be permanently online is less pronounced than to be permanently connected” (Vorderer et al., 2016, 702) and that being permanently connected with others is probably perceived as more important than being permanently online (idem: 699). This is now not only confirmed through the analysis of overt behavior but is also meaningful from the quantitative point of view.
The distinction between “information” technology and “communication” technology (as opposed to ICT, Information and Communication Technologies, where the two dimensions are combined) might be useful to explore this issue in future research. Students seem to adopt distinct patterns in the “online” and “connected” dimensions, suggesting that Internet use in the “online” dimension might correspond to the use of an “information” technology and, by contrast, Internet use in the “connected” dimension might correspond to the use of a “communication” technology. Since data retrieval, analysis and processing are qualitatively different from computer-mediated communication, this would imply that we could be facing not one, but two different technologies, from the user’s perspective.
The distinction between periods with classes and periods with no classes also relates to important changes in web use. Both the number of users and the daily average web navigation actions per user decrease in periods with no classes, when compared to periods with classes. In the case of the number of users, this reduction occurs on every day of the week. In the case of navigation actions, the reduction occurs mostly on weekends (especially Saturdays).
Considering changes in the number of tracked users throughout the hours of the day, the resting time changes from 3.00 a.m.-9.00 a.m. in periods with classes to 2.00 a.m.-11.00 a.m. in periods with no classes. The resting period is therefore extended by three hours, starting one hour earlier and finishing two hours later, when students have no classes. In future research it would be interesting to examine the relationship between these behaviors, sleep routines and potential sleep disturbances (idem; Murdock, 2013).
The distribution of web navigation actions throughout the hours of the day is also significantly affected by the existence of classes. With classes, tests confirm the existence of different periods of intensity of web navigation actions throughout the day, making it possible to establish higher, medium and lower intensity hourly time periods. With no classes, tests do not confirm these differences between the hours of the day and do not allow different intensity periods to be established. This tends to support the idea that students intentionally stay offline and reduce Internet use, especially during holidays, in order to provoke positive feelings (Vorderer et al., 2016, 699-700). The difference is statistically significant but not very substantial, however (a reduction of around 15% in the average number of users tracked and 5% in web navigation actions).
These trends clearly suggest that laptop and desktop use is heavily connected with academic use and, in addition, that daily use, rest and sleep routines can be affected by academic activity, which deserves further attention.
Regarding the issue of possible differences between representations and actions, a specific analysis was carried out based on the two sources of information: the online survey conducted in July 2019 and the data collected through the Google Chrome extension between January 20 and July 20, 2019.
This analysis included only Internet users who (i) answered the survey, (ii) participated in the online panel in a minimum of 24 of the 26 weeks considered, (iii) participated in the online panel in all of the last 6 weeks before the survey was conducted, (iv) were students at the same higher education institution and (v) used Google Chrome exclusively, or almost exclusively, as their web browser.
These criteria resulted in a total of 12 users being considered in this specific analysis.
All of these users reported using the Internet on a daily basis, including weekends, on their laptop or desktop computer (cf. Table 1).
Table 1. Regularity of Internet access (survey to panel members)
Question: How often do you usually access the Internet?
(Section on Internet use through laptop or desktop computers, not including tablets or smartphones).
However, using the information collected through the Google Chrome extension, it was possible to calculate that these users were effectively accessing the Internet 5.7 days a week, on average (and not 7, as would be the case with daily access, including weekends).
Table 2. Regularity of Internet access declared in the survey, number of weeks tracked through the panel application and average number of days per week with Internet access (survey to panel members and data from Google Chrome extension)
Only one panel member (user 8) effectively accesses the Internet daily (6.9 days a week, on average, over the 26 weeks). Two other members (users 5 and 7) come close, accessing the Internet on 6.5 days a week, on average. All other panel members usually access the Internet between 5 and 6 days a week.
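The comparison measure can be illustrated as follows: for each user, the number of distinct days with at least one WNA is divided by the 26 weeks of the observation window (the sketch uses toy data and hypothetical column names).

```python
# Illustrative calculation of the average number of days per week with Internet access.
import pandas as pd

wna = pd.DataFrame({  # toy data: one row per web navigation action
    "user_id": [8, 8, 8, 5, 5],
    "visited_at": pd.to_datetime([
        "2019-06-03 10:00", "2019-06-03 18:00", "2019-06-04 09:30",
        "2019-06-03 11:15", "2019-06-05 20:40",
    ]),
})
wna["date"] = wna["visited_at"].dt.date

n_weeks = 26  # observation window: January 20 to July 20, 2019
days_active = wna.groupby("user_id")["date"].nunique()  # distinct days with >= 1 WNA
days_per_week = (days_active / n_weeks).round(1)
print(days_per_week)  # toy values; in the real panel, user 8 reached 6.9 days a week
```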
Although this is still a limited sample, it is possible to conclude that three quarters of the participants over-report their frequency of Internet access. While all the participants report that they access the Internet every day, including weekends, only 25% effectively access the Internet every day, on average. The members of this sample therefore seem to consider their Internet access to be more frequent than it effectively is.