Abstract
- The acceleration of digital transformation across public and private sectors has exacerbated disparities in digital literacy, particularly among older adults who face cognitive, sensory, and technological barriers to effective engagement. In South Korea—despite widespread smartphone ownership among the elderly—digital proficiency remains disproportionately low, underscoring the inadequacy of current accessibility-focused interventions and the pressing need for cognitively and perceptually attuned digital education frameworks. This study examines the effectiveness of Ee Eum, a prototype digital literacy intervention specifically designed for adults aged 65 and older, integrating user-centered interface design principles with tiered educational scaffolding. A sequential mixed-methods design was employed. Initial qualitative inquiry through focus group and individual interviews (n = 30) identified key usability obstacles and content needs. This was followed by a series of controlled usability experiments (n = 31), including A/B testing of visual variables (e.g., contrast ratio, font size) and First Click Tests to evaluate interface navigability and perceptual clarity. Results demonstrated that high-contrast color combinations (e.g., yellow text on blue backgrounds) and enlarged text sizes (25–28pt) significantly improved legibility and reduced cognitive load. The inclusion of visual affordances in user interface elements led to substantial gains in navigational accuracy, with First Click Test performance increasing from 39.79% to 86.02% when target areas were visually emphasized. These findings provide empirical support for the role of perceptually optimized interface design in enhancing digital accessibility for older adults. The Ee Eum prototype offers a replicable framework for inclusive UI/UX development and contributes to ongoing discourse in gerontechnology, digital equity, and human-centered aging policy.
Keywords: Digital Literacy, Older Adults, Human-Centered Design, UI/UX Optimization, Gerontechnology, Mixed-Methods Evaluation
1. Introduction
- The pervasive digitalization of essential services—including healthcare, transportation, finance, and government—has amplified the urgency of addressing digital inequality, particularly among older adults. Although South Korea is recognized as one of the world’s most technologically advanced societies, digital literacy among its elderly population remains markedly low (Kim, 2024). While 87.1% of Korean adults aged 65 and older own smartphones, approximately 63.2% are unable to independently install or use applications (Kim, 2024; Hwang, 2024), suggesting a profound gap between device access and functional usage capacity (Ministry of Science and ICT, 2022, p. 57). This disparity underscores the limitations of infrastructure-centric policies and reflects a broader global trend in which the digital divide is no longer defined by access alone, but increasingly by the capacity for meaningful and autonomous engagement (Hwang, 2020).
- The disparity is further visualized in Figure 1, which compares digital informatization levels across marginalized groups in Korea—namely the disabled, low-income individuals, agricultural and fishery populations (Agri-Fishery Population), and the elderly. While all groups exhibit relatively high scores in access to devices and networks, stark deficits appear in areas requiring digital competence and full utilization, with the elderly demonstrating the most significant drop-off across all categories (Ministry of Science and ICT, 2022, p. 57).
- Scholarly and policy discourses have shifted accordingly, framing digital literacy as a composite of operational, navigational, and critical competencies. OECD data reveal stark generational differences in digital problem-solving abilities: fewer than 5% of older adults demonstrate proficiency, compared to over 60% of young adults. These disparities are clearly reflected in Figure 2, which displays Internet usage rates by age and educational attainment across OECD countries. While usage among younger individuals remains consistently high, the gap becomes more pronounced among adults aged 55–74 with low or no formal education, indicating that digital exclusion intersects with both age and education level (OECD, 2019, p. 191). This gap has been further exacerbated by the COVID-19 pandemic, during which many core public services transitioned to online-only platforms, leaving technologically unprepared seniors increasingly isolated and excluded (Kim, 2023; MSV Insight, 2023). In response, nations such as the United States, Japan, and Australia have initiated long-term, community-based interventions—such as CYBER-SENIORS (CYBER-SENIORS, 2021), BeConnected (Be Connected – Every Australian Online, 2018), and Smart Silver Project (Kim & Park, 2021)—focused on digital mentoring, skill-building, and inclusive design.
- By contrast, digital inclusion initiatives in South Korea remain predominantly supply-driven, emphasizing hardware distribution and network access. Few initiatives incorporate cognitive aging principles, or user interface and experience (UI/UX) adaptations tailored to perceptual and processing limitations commonly observed in older populations. Existing digital literacy programs are often confined to didactic instruction and static video tutorials, which fail to support active, experiential learning or provide intuitive navigation. Moreover, the widespread use of English in interface elements further compounds barriers to adoption and comprehension among monolingual Korean-speaking elders (Kim, 2021).
- This study addresses these gaps through the development and evaluation of Ee Eum, a digital literacy intervention specifically designed for older adults. The program integrates a user-centered interface, high-contrast visual elements, scalable typography, and Korean-language microcopy—brief, task-oriented text elements embedded in digital interfaces that provide contextual guidance to users—to support perceptual clarity and task comprehension. It further incorporates interactive, tiered content allowing for self-paced learning and immediate feedback. The theoretical foundation of the intervention draws upon cognitive load theory, visual perception research in aging, and best practices in gerontechnology design.
- The primary objective of this research is to evaluate whether UI/UX-optimized digital learning tools can significantly improve digital task performance among elderly users, and to use the resulting data to develop a Minimum Viable Product (MVP). Accordingly, the study evaluates the following hypotheses:
- 1. H1: High-contrast color schemes and enlarged font sizes will improve task accuracy and reduce response time relative to conventional interface configurations.
- 2. H2: Visual affordances (e.g., highlighted UI elements) will enhance interface navigability and First Click accuracy, as measured by the First Click Test, a usability evaluation method that records the location of a user’s initial click when attempting to complete a predefined task (Experiment A: without visual assistance; Experiment B: with visual assistance).
- Through a mixed-methods research design incorporating usability testing and interface experiments, this study aims to contribute empirically grounded insights into inclusive digital design and inform future public policy on digital aging and equity.
- The project promotes active digital participation and contributes to the UN Sustainable Development Goals (SDGs). For SDG 4 (quality education), it offers flexible, hands-on learning unconstrained by time or location. Unlike traditional digital training, this model emphasizes practical interaction and iterative improvement through user data. It also addresses SDG 10 (reduced inequalities) by narrowing the digital divide and SDG 11 (sustainable cities) by improving access to public services. Broader impacts include SDG 3 (health and well-being) and SDG 9 (innovation and infrastructure), as digital literacy strengthens access to healthcare and financial tools.
2. Materials and Methods
- Materials
- The digital literacy intervention evaluated in this study, titled Ee Eum, was developed as a web-based mobile application tailored for elderly users (aged 65 and above). The application incorporated design features based on established cognitive and perceptual principles in aging, including high-contrast color schemes, scalable font sizes, simplified navigation structures, and Korean-language microcopy. All interface components were designed and prototyped using Figma (Figma Inc., Version 119.7.6), a collaborative web-based UI/UX design tool.
- The preliminary survey employed a qualitative method in the form of in-depth interviews (IDIs), facilitated through a structured online survey hosted on Google Forms. Usability testing was conducted using the Lyssna platform (formerly UsabilityHub), a web-based user testing suite that enabled A/B testing (an experimental method used to compare two versions of a design element, Version A and Version B, by measuring user responses to each) and First Click Test (FCT) implementation. All content within the test was standardized across experimental and control conditions, with textual material matched for length, lexical complexity, and topical familiarity.
- The A/B testing experiments were conducted using digital prototypes displayed on standardized desktop or laptop screens. All test materials were created using the same font ("Malgun Gothic"), layout structure, and content. Controlled environmental factors included screen resolution, brightness, and contrast. The textual stimuli and UI interfaces were presented using consistent screen dimensions and lighting conditions. Each experimental condition was counterbalanced to mitigate order effects. A total of 31 participants were recruited, representing information-vulnerable groups.
- Methods
- Two groups of participants were recruited for this study: (1) a group of adults aged 65 or older (n = 30) who completed a preliminary survey to identify the pain points of the target group, and (2) a second group (n = 31) that engaged in usability testing and interface experiments (Maze, 2023). All participants were South Korean residents aged 65 or older, recruited through local community centers and senior welfare organizations. Inclusion criteria included smartphone ownership and self-reported low to moderate digital proficiency. Exclusion criteria included significant uncorrected visual impairments, cognitive disorders, or previous participation in formal digital literacy training within the past six months.
- Three A/B tests were designed to assess specific aspects of usability:
1) A/B Test: Background vs. Text Color Contrast
- This test evaluated the effect of color contrast on readability and cognitive efficiency. Participants were randomly assigned to one of three contrast conditions—high, medium, or low. The high-contrast condition featured combinations such as yellow text on a blue background; the medium- and low-contrast conditions were used as comparative baselines. The specific color codes are listed in Table 1. A binary cognitive task was administered to assess performance under each visual condition. The design followed a double-blind, randomized, and fully counterbalanced structure.
- Contrast conditions were classified with reference to the luminance-contrast criteria of the Web Content Accessibility Guidelines (WCAG), which specify minimum contrast ratios of 4.5:1 (Level AA) and 7:1 (enhanced, Level AAA) for normal text (W3C, 2018). In the present study, high contrast was operationalized as a luminance ratio of ≥ 7.5:1, moderate contrast as 4.5:1 to 7.0:1, and low contrast as below 4.5:1. These thresholds were used to systematically classify and manipulate color contrast conditions during experimental design.
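The WCAG luminance-contrast ratio referenced above can be computed directly from HEX color values. The sketch below follows the WCAG 2.1 definition (sRGB channel linearization, relative luminance weighting, then (L1 + 0.05) / (L2 + 0.05)); the function names are illustrative and not part of the study's tooling.

```python
import math  # not strictly needed here, but kept for clarity if extended


def _linearize(channel_8bit):
    """Linearize one 8-bit sRGB channel per the WCAG 2.1 formula."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4


def relative_luminance(hex_color):
    """Relative luminance of a color given as '#RRGGBB'."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return (0.2126 * _linearize(r)
            + 0.7152 * _linearize(g)
            + 0.0722 * _linearize(b))


def contrast_ratio(fg, bg):
    """WCAG contrast ratio between foreground and background colors."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

For example, `contrast_ratio("#FFFFFF", "#000000")` evaluates to 21.0, the maximum possible ratio, while the study's low-contrast gray-on-white pairing falls below the 4.5:1 threshold.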
2) A/B Test: Text Size
- This test examined the relationship between text size and user performance on reading comprehension tasks. Participants were exposed to sentences in varying font sizes ranging from 16pt to 28pt. Previous studies have identified an optimal font size range (18–24pt) for readability among older users, as font sizes below 18pt tend to hinder legibility, whereas excessively large fonts (above 24pt) disrupt screen structure due to increased scrolling demand (Hou et al., 2022; Kennedy, 2024). Each sentence was followed by a comprehension question with two possible answers and a "pass" option. To prevent carryover effects, each condition featured different sentences addressing similar topics, maintaining an equal word count and comparable sentence structures. Texts were displayed for 500 milliseconds each. Two practice trials were conducted before the main experiment.
3) A/B Test: First Click Test
- This test investigated the impact of interface design on intuitive navigation behavior. Participants were asked to locate specific functions within a taxi application interface (Kakao Taxi, Version 6.33.1), both with and without visual emphasis (i.e., bounding boxes). All participants received identical experimental instructions to minimize interpretation differences. Each condition involved four tasks, and screen elements remained consistent in position, visibility, and size, displayed on the same device throughout. A double-blind procedure was employed to prevent feedback bias, and participants were not time-restricted, to mimic real-world interaction patterns.
- In the first condition (Experiment A), participants were presented with the main screen of the app and the following question: "You have discovered that a taxi ride was accidentally double charged. Where would you go to check your ride history?" Participants were instructed to click on the button they believed to be correct. This task was repeated across three different screens, each maintaining consistent position, visibility, and size of screen elements.
- In the second condition (Experiment B), the same interface and question were presented; however, the correct button was highlighted with a bright yellow bounding box (#FFD33C) to enhance visual salience. Participants repeated the process across three screens under this condition as well. To maintain experimental validity, participants were not informed whether they had selected the correct answer in the first condition.
- Measures
- Quantitative outcome variables included:
- 1. Task Performance Accuracy: Measured via First Click Test accuracy (% correct clicks) and comprehension questions during the A/B text-size trials.
- 2. Response Time: Time (in seconds) taken to complete digital tasks under different UI conditions.
- 3. Readability Metrics: User preferences and reading accuracy across font sizes (16–28pt) and background/text contrast levels (e.g., yellow-on-blue vs. gray-on-white).
- 4. Engagement Metrics: Captured via app analytics (lesson completion rate, time on task, frequency of use).
- All testing environments maintained constant device parameters (screen size, brightness, resolution) and minimized external distractions to ensure ecological validity.
- This study adopted a sequential mixed-methods design comprising three phases:
- 1. Literature and Policy Review: A systematic review of existing academic literature and policy documents was conducted to establish a baseline understanding of digital inclusivity trends and frameworks in both domestic and international contexts.
- 2. Exploratory Phase (Qualitative): Focus group and individual interviews (n = 30) were conducted to identify perceived barriers, digital usage behaviors, and instructional preferences among older adults. Insights informed the instructional design and UI/UX framework of the Ee Eum prototype.
- 3. Interface Testing Phase (Experimental Design):
- o A/B Testing I (Color Contrast): Tested combinations of high, medium, and low contrast backgrounds and text colors to determine optimal visual clarity. Five high-contrast and four control color schemes were evaluated.
- o A/B Testing II (Text Size): Tested five font sizes ranging from 16pt to 28pt. Accuracy and completion times were recorded.
- o First Click Test: Evaluated interface navigability with and without visual emphasis on clickable targets (Experiment A: without visual aid; Experiment B: with visual aid)
- Analysis
- All quantitative data were analyzed using IBM SPSS Statistics (Version 27.0; IBM Corp., Armonk, NY, USA). Descriptive statistics were computed to summarize participant characteristics and overall task performance.
- For the Background vs. Text Color Contrast test, outcomes included user selection rate (%) and response time (measured in milliseconds) under each contrast condition (high, medium, low). This allowed for comparative analysis of readability preferences and cognitive efficiency. In the Text Size test, performance was evaluated by calculating the accuracy rate (%) for comprehension questions associated with font sizes. Correlation analyses were conducted to assess the relationship between font size and user accuracy. For the First Click Test, the key outcome measure was the error rate (%), defined as the proportion of incorrect selections across trials. Click behavior was compared between conditions with and without visual affordances (e.g., bounding boxes), and additional data on time-to-first-click (in seconds) were recorded to assess navigational efficiency.
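For the First Click Test comparison, the difference between the reported accuracy rates (39.79% without emphasis vs. 86.02% with emphasis) can be checked with a pooled two-proportion z-test. The counts below are an assumption for illustration: they are consistent with the reported percentages over 93 trials (31 participants × 3 screens), but the study's raw trial counts are not reported in the text.

```python
import math


def two_proportion_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value


# Illustrative counts: 37/93 ~ 39.79% (Experiment A, no emphasis)
# and 80/93 ~ 86.02% (Experiment B, with emphasis).
z, p = two_proportion_ztest(37, 93, 80, 93)
```

With these assumed counts the test yields z ≈ -6.5 with p far below conventional significance thresholds, consistent with the study's conclusion that visual emphasis substantially improves first-click accuracy.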
3. Results
- A/B Test: Color Contrast
- The effect of background-text color contrast on readability and user preference was significant. High-contrast combinations (e.g., yellow text on blue background) were significantly favored over low-contrast combinations (e.g., gray on white) as indicated in Figure 3.
- Participants demonstrated the highest accuracy and fastest response times with yellow-on-blue contrast (M = 45.2%, mean response time = 6 seconds), while black-on-white and green-on-black combinations were associated with lower accuracy and longer response times as presented in Figure 4.
- A/B Test: Text Size
- Text sizes ranging from 16pt to 28pt were evaluated for readability and accuracy. The 24–28pt range resulted in significantly higher task accuracy (M = 77.4%) compared to smaller sizes (16–20pt, M = 61.3%). Optimal balance between legibility and layout efficiency was observed at 21–24pt, suggesting a preferred design compromise for interface text.
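The positive font size/accuracy relationship reported here corresponds to the correlation analysis described in the Analysis section. The sketch below computes a Pearson correlation; the data points are illustrative values consistent with the reported group means (61.3% for 16–20pt, 77.4% for 24–28pt), not the study's raw trial data.

```python
import math


def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Hypothetical per-size accuracies, chosen only to match the
# reported group means for small (16-20pt) and large (24-28pt) text.
sizes = [16, 18, 20, 24, 28]
accuracy = [58.0, 61.0, 65.0, 75.0, 80.0]
r = pearson_r(sizes, accuracy)
```

Any data pattern matching the reported means produces a strongly positive r, in line with the finding that larger text improved comprehension accuracy up to the layout-efficiency ceiling.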
- First Click Test (FCT): Interface Navigability
- First Click Test results demonstrated a significant improvement in click accuracy when visual emphasis (e.g., colored boxes) was applied to key UI elements. Mean click accuracy improved from 39.8% (not emphasized) to 86.0% (emphasized) as shown in Figure 5. The spatial distribution of participants’ initial click responses is illustrated in Figure 6.
- Digital Assistant: Ee Eum (MVP)
- As a direct outcome of the empirical usability evaluation—including A/B testing on color contrast and text size, and First Click Tests on interface navigability—the Ee Eum prototype was systematically developed to embody the validated principles of age-inclusive interface design. Design decisions were not speculative but grounded in measurable improvements observed during testing: high-contrast visual pairings (e.g., yellow-on-blue) and enlarged font sizes (25–28pt) significantly enhanced readability and reduced response time, while the application of visual affordances more than doubled First Click accuracy, confirming the importance of perceptual salience for interface clarity.
- To further contextualize user expectations and interface architecture, a comprehensive wireframe audit of representative Korean mobile applications was conducted across four service domains—Health, Finance, Transportation, and Communication— and the core smartphone functions across Android and iOS platforms as shown in Figure 7. These wireframes, captured and organized using Figma, served as a reference for aligning the Ee Eum prototype with familiar UI flows, minimizing disorientation among older users. Full list of digital literacy modules used in the intervention is summarized in Table 2.
- Additionally, to support first-time users with limited digital experience, the prototype incorporated personalized instructional overlays. These dynamic onboarding elements included visual cues, Korean-language speech bubbles, and auditory prompts that offered step-by-step guidance throughout common tasks (see Figure 8 and Figure 9). Designed to reduce cognitive load and foster autonomous navigation, these features exemplify the application of human-centered design principles for cognitively accessible interaction.
4. Discussion
- The study demonstrates how a user-centered digital platform can drive progress toward multiple SDGs by fostering inclusive, sustainable digital engagement for the elderly. In advancing SDG 4, it enables lifelong learning through accessible, practical interaction. Continuous feedback integration supports high-quality, adaptive education. The project reduces digital inequalities (SDG 10), expands access to public services (SDG 11), and enhances well-being (SDG 3) through digital healthcare and connectivity. It further supports SDG 9 by integrating elderly users into digital infrastructure.
- This study examined the effectiveness of an age-inclusive digital literacy intervention (Ee Eum) by evaluating interface design features and their impact on cognitive performance and learning outcomes among older adults. The findings align with a growing body of literature emphasizing that perceptual and cognitive barriers—not mere lack of access—are primary contributors to digital exclusion in elderly populations. By incorporating high-contrast visual schemes, scalable typography, and intuitive navigational elements, the intervention demonstrated measurable improvements in both usability (First Click accuracy, readability) and educational performance (learning gains and retention).
- The strong user preference for high-contrast pairings, particularly yellow-on-blue, and larger font sizes (25–28pt) supports previous research in gerontechnology and visual ergonomics, which emphasizes the diminished sensitivity to low-contrast stimuli with age. Likewise, the significant increase in First Click accuracy following UI emphasis reflects prior work showing that perceptual salience and guided visual hierarchy are essential for older users' interface navigation. These results reinforce the theoretical framework underpinning the design—cognitive load theory, age-related visual processing, and interaction design principles.
- The outcomes of this study reinforce the theoretical foundation upon which Ee Eum was designed—namely, cognitive load theory, age-related perceptual processing research, and principles of human-centered interaction design. Together, the results suggest that relatively modest, yet targeted UI/UX modifications can significantly reduce cognitive load and improve interface usability for elderly users, thereby supporting more autonomous and confident engagement with digital content.
- However, these findings must be interpreted in light of several methodological limitations. First, although the sample size (n = 31) was sufficient for usability testing, the findings may not be generalizable to all segments of the elderly population, particularly those with significant cognitive impairment, low literacy levels, or no prior exposure to smartphones. Second, usability testing was conducted under controlled conditions, including standardized screen size, lighting, and minimal distractions. While this control enhances internal validity, it limits ecological validity, as real-world usage environments are more variable. Third, the initial needs assessment relied on self-reported data from older adults with limited digital fluency, raising the possibility of response bias or misunderstanding. These constraints underscore the need for cautious interpretation and replication through larger, more rigorous studies.
5. Conclusions
- The study contributes to the evidence base on digital literacy interventions by demonstrating that user-centered interface design, grounded in the cognitive and perceptual characteristics of older adults, can significantly enhance digital usability and learning outcomes. The Ee Eum application prototype, through high-contrast visual cues, readable typography, and intuitive interaction flows, enabled elderly users to navigate digital content more accurately, learn more effectively, and remain more engaged than those in a traditional learning condition.
- Despite certain limitations—including reliance on self-reported data, a modest sample size, and the artificiality of controlled environments—the results underscore the importance of moving beyond hardware access to address cognitive and perceptual accessibility in digital literacy efforts. The findings highlight the value of participatory usability testing and perceptually informed UI/UX design in closing the digital gap for aging populations.
- Future research should expand this line of inquiry through larger, demographically diverse samples and real-world usage contexts. Incorporating biometric or behavioral analytics could enhance the objectivity of usability assessments, while longitudinal designs would help evaluate the durability of interface familiarity and digital confidence. Ultimately, aligning digital interfaces with the sensory and cognitive profiles of older adults offers a promising pathway toward inclusive technology design and more equitable digital participation.
Figure 1. The Level of Digital Informatization by Information Vulnerable Groups (disabled, low-income, farmers and fishermen, and the elderly) in Korea 2022. Source: Ministry of Science and ICT (2022). 2022 Digital Divide Report, p. 57.
Figure 2. Percentage of Internet users by age and education level in major OECD member countries. Source: OECD (2019). Measuring the Digital Transformation: A Roadmap for the Future, p. 191.
Figure 3. User Preference by Contrast Level.
Figure 4. Selection Rate and Response Time Graph for High Contrast (Ratio ≥ 7.5:1).
Figure 5. Heatmap of Trial-by-Trial Error Analysis for Experiment A and Experiment B.
Figure 6. Trial-by-Trial Heatmap Comparison of Experiment A and Experiment B.
Figure 7. Wireframe of the Digital Literacy Program Ee Eum.
Figure 8. Wireframe of Guide within Digital Literacy Program Ee Eum.
Figure 9. Implementation Stage of User-Customized Features: Direct Guidance by Joints on First Use (MVP).
Table 1. Color Combinations Used in Contrast Conditions (HEX Values)

| Contrast Level | Foreground (Color) | Background (Color) |
| --- | --- | --- |
| High | Yellow (#FFD33C) | Blue (#0000FF) |
| High | White (#FFFFFF) | Black (#000000) |
| High | Black (#000000) | White (#FFFFFF) |
| High | Yellow (#FFD33C) | Red (#FF0000) |
| High | Blue (#0000FF) | Yellow (#FFD33C) |
| Medium | Light Orange (#FFA07A) | White (#FFFFFF) |
| Medium | Light Gray (#D3D3D3) | Pale Blue (#E6F0FA) |
| Low | Green (#008000) | Black (#000000) |
| Low | Gray (#808080) | White (#FFFFFF) |
Table 2. Digital Literacy Training Modules by Functional Domain

| Module Code | Function | Description |
| --- | --- | --- |
| App_Basic_000 | App Installation | Installing apps (Kakao Talk) |
| App_Basic_001 | Calendar | Managing schedules and setting reminders |
| App_Basic_002 | Memo | Creating and editing personal notes |
| App_Basic_003 | Messaging Apps | Sending messages with and without saved contact |
| App_Basic_004 | Alarm Setting | Setting and adjusting alarms |
| App_Basic_005 | Timer | Setting and adjusting timers |
| Transportation_000 | Kakao Taxi | Requesting a taxi via a mobile application |
| Transportation_001 | KorailTalk App | Accessing and using the KorailTalk train booking app |
| Transportation_002 | KorailTalk Registration (PASS) | Registering with phone number and PASS verification |
| Transportation_003 | KorailTalk Registration (Kakao) | Registering using KakaoTalk authentication |
| Kakao_000 | Video Calling | Making video calls via KakaoTalk |
| Kakao_001 | Sending Photos | Sending photos in KakaoTalk chats |
| Kakao_002 | Add Friends | Adding friends from the recommendation list |
| Kakao_003 | Kakao Pay Integration | Linking a bank account to Kakao Pay |
| Kakao_004 | Kakao Gift | Sending gifts via the KakaoTalk gift feature |
| Kakao_005 | Voice Messaging | Recording and sending voice messages |
| Kakao_006 | Save Photos/Videos | Downloading and saving media from chat |
| Kakao_007 | Location Sharing | Sharing real-time location via KakaoTalk |
| Finance_000 | Kakao Transfer | Sending money through KakaoTalk |
| Finance_001 | Payment Settlement via Kakao | Requesting or settling payments with Kakao |
| Finance_002 | Payment via Kakao Pay | Completing transactions using Kakao Pay |
| Finance_003 | Account Linking | Registering a bank account with Kakao for payment use |
| Health_000 | Ddokdoc Appointment | Booking health checkups via Ddokdoc |
| Health_001 | DoctorNow Telehealth | Accessing remote medical consultations via DoctorNow |
| Health_002 | E-Gen Locator | Finding nearby hospitals and pharmacies using E-Gen |
| Health_003 | Medication Reminder (Tae-Yang-I) | Managing medication schedules and reminders |
References
- Be Connected – Every Australian Online. (2018). Be Connected – Every Australian online. https://beconnected.esafety.gov.au/
- Cyber-Seniors. (2021). Cyber-Seniors: Bridging the digital divide. https://www.cyberseniors.org/
- Hou, G., Anicetus, U., & He, J. (2022). How to design font size for older adults: A systematic literature review with a mobile device. Frontiers in Psychology, 13, Article 931646. https://doi.org/10.3389/fpsyg.2022.931646
- Hwang, J. W. (2024). Digital exclusion among the elderly. Case News. https://www.casenews.com/
- Hwang, N., Kim, H. S., Kim, K. R., Joo, B. H., Hong, S. H., & Kim, J. H. (2020). Current status of information use and solutions to digital exclusion among the elderly. Korea Institute for Health and Social Affairs. https://repository.kihasa.re.kr/handle/201002/37360
- Kennedy, E. D. (2024). Font size guidelines for responsive websites. Learn UI Design: The Complete Online Video Course. https://www.learnui.design/blog/mobile-desktop-website-font-size-guidelines.html
- Kim, C.-N., & Park, S.-H. (2021). Analysis of digital literacy education status among the elderly in the metaverse era. The Korean Society of Cognitive Therapeutic Exercise, 13(2), 29-36. https://doi.org/10.29144/kscte.2021.13.2.29
- Kim, G. H. (2023). Why digital literacy is important in an aging society. Brunch Story. https://brunch.co.kr/@gh-kim/25
- Kim, H. G. (2024). 6 out of 10 elderly smartphone users are unable to install apps. Medical Newspaper. http://www.bosa.co.kr/news/articleView.html?idxno=2214755
- Kim, Y. J. (2024). Korea’s digital financial literacy below OECD average despite tech leadership. Nate News. https://news.nate.com/view/20240619n26145
- Maze. (2023). User testing: How many user testers do you need per method? Maze Blog. https://maze.co/blog/user-testing-how-many-users/
- Ministry of Science and ICT. (2022). 2022 digital divide survey results. https://www.msit.go.kr/bbs/view.do?sCode=user&mId=113&mPid=238&bbsSeqNo=94&nttSeqNo=3182854&search
- MSV Insight. (2023). Why digital literacy matters in aging society: Part 1 – Inclusion in healthcare, finance, and government. MSV Insight. https://msvinsight.com/digital_literacy/
- OECD. (2019). Measuring the digital transformation: A roadmap for the future. OECD Publishing. https://doi.org/10.1787/9789264311992-en