Transmission of cultural knowledge and linguistic scaffolding | HackerNoon

News Room · Published 26 February 2025

Authors:

(1) Raphaël Millière, Department of Philosophy, Macquarie University ([email protected]);

(2) Cameron Buckner, Department of Philosophy, University of Houston ([email protected]).

Table of Links

Abstract and 1 Introduction

2. A primer on LLMs

2.1. Historical foundations

2.2. Transformer-based LLMs

3. Interface with classic philosophical issues

3.1. Compositionality

3.2. Nativism and language acquisition

3.3. Language understanding and grounding

3.4. World models

3.5. Transmission of cultural knowledge and linguistic scaffolding

4. Conclusion, Glossary, and References

3.5. Transmission of cultural knowledge and linguistic scaffolding

Another interesting question is whether LLMs might engage in cultural acquisition and play a role in the transmission of knowledge. Prominent theorists have suggested that the key to human intelligence lies in a unique set of predispositions for cultural learning (Tomasello 2009). While other primates may share some of these dispositions, these theorists argue that humans are uniquely equipped to cooperate with one another to acquire and transmit knowledge from one generation to the next. Tomasello has explained the uniquely human capacity for cultural learning in terms of a “ratchet effect,” a metaphor for the ratcheting wrench, which clicks into place to hold its position each time it is turned further in the desired direction. Chimpanzees and other animals, Tomasello argues, can learn in many of the same ways that humans do, and even acquire regional differences in their problem-solving strategies, such as different troops using different tool-making techniques to fish for termites. However, he claims that only humans can pick up right where the previous generation left off and continue making new progress on linguistic, scientific, and sociological knowledge. This constant ratcheting is what allows a steady progression of human knowledge accumulation and discovery, compared to the relatively stagnant cultural evolution of chimpanzees and other animals.

Given that deep learning systems already exceed human performance in several task domains, it is interesting to ask whether LLMs might be able to emulate many of these components of cultural learning to pass on their discoveries to human theoreticians. For instance, humans are already reverse-engineering the strategies of AlphaZero to produce mini-revolutions in the explicit theory of Go and chess (Schut et al. 2023). Similarly, latent knowledge in specialized domains such as materials science can be extracted even from a simple word embedding model (Tshitoyan et al. 2019). In these instances, it is primarily humans who are synthesizing and passing on culturally-transmissible knowledge by interpreting the model’s outputs and internal activations. This human-led interpretation and transmission underscore a crucial aspect of cultural ratcheting: the ability not only to generate novel solutions but also to understand and communicate the underlying principles of these solutions, thereby enabling cumulative knowledge growth.
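
To make the word-embedding example concrete, the sketch below shows one way latent domain associations can be read out of a simple embedding model, by ranking candidate terms by cosine similarity to a functional keyword. It is only loosely in the spirit of Tshitoyan et al. (2019): the toy corpus, the use of gensim, and the query terms are illustrative assumptions, not the original study’s pipeline.

```python
# Illustrative sketch only (not the pipeline of Tshitoyan et al., 2019):
# surfacing latent domain associations from a simple word embedding model.
# The mini-corpus and query terms below are invented for illustration; the
# original study trained Word2Vec on millions of materials-science abstracts.
from gensim.models import Word2Vec

corpus = [
    ["Bi2Te3", "is", "a", "well", "known", "thermoelectric", "material"],
    ["PbTe", "shows", "a", "high", "thermoelectric", "figure", "of", "merit"],
    ["SnSe", "exhibits", "promising", "thermoelectric", "performance"],
    ["GaAs", "is", "widely", "used", "in", "optoelectronic", "devices"],
]

# Train a small skip-gram model on the toy corpus.
model = Word2Vec(corpus, vector_size=32, window=3, min_count=1, sg=1, epochs=200)

# Rank candidate materials by cosine similarity to a functional keyword,
# mimicking how latent "knowledge" is read off the embedding space.
candidates = ["Bi2Te3", "PbTe", "SnSe", "GaAs"]
ranked = sorted(candidates, key=lambda m: model.wv.similarity(m, "thermoelectric"), reverse=True)
print(ranked)
```

Even in this toy setting, it is still the human reader who decides which similarities count as domain knowledge, which is precisely the division of labor described above.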

Could LLMs ever explain their strategies to humans in a theoretically-mediated way that participates in and enhances human cultural learning? This question is directly related to whether LLMs can genuinely generalize to out-of-distribution (OOD) data. As discussed in section 3.1, there is converging evidence that Transformer-based models may generalize compositionally under some train-test distribution shifts.[14] But the present issue intersects with a different kind of generalization – the ability to solve genuinely novel tasks. To borrow from Chollet’s (2019) taxonomy, we can distinguish between local task generalization, which involves handling new data within a familiar distribution for a known range of tasks; broad task generalization, which involves handling new data under modest distribution shift for a wide range of tasks and environments; and extreme task generalization, which requires handling new data for entirely novel tasks that represent a significant departure from any previous data distributions. Current LLMs seem able to master a wide variety of tasks reflected in their training sets; as such, they exhibit at least local task generalization, if not broad task generalization. However, like chimpanzees that learn from observing their troop mates, they often seem to have a hard time pushing beyond the range of tasks well-represented in their training data (McCoy et al. 2023).
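
As a minimal illustration of what a distribution shift amounts to in practice (a toy construction, not Chollet’s formal framework), consider a sequence-reversal task in which the test split systematically exceeds the input lengths seen during training; succeeding on it requires the underlying rule rather than recall of training pairs. The task, alphabet, and split sizes below are assumptions chosen purely for illustration.

```python
# Illustrative sketch: an out-of-distribution (OOD) evaluation split for a
# toy sequence-reversal task. Training covers short strings; the OOD split
# contains only longer strings, so success requires the rule, not recall.
import random

random.seed(0)

def make_example(length: int) -> tuple[str, str]:
    s = "".join(random.choice("abcd") for _ in range(length))
    return s, s[::-1]  # input string and its reversal

train    = [make_example(random.randint(2, 4)) for _ in range(1000)]   # training distribution
iid_test = [make_example(random.randint(2, 4)) for _ in range(200)]    # same distribution
ood_test = [make_example(random.randint(8, 12)) for _ in range(200)]   # systematic shift

def accuracy(predict, data):
    """Exact-match accuracy of a candidate predictor on a split."""
    return sum(predict(x) == y for x, y in data) / len(data)

# A predictor that merely memorizes training pairs scores well in-distribution
# (most short strings were seen during training) but collapses on the shifted split.
lookup = dict(train)
memorizer = lambda x: lookup.get(x, "")
print(accuracy(memorizer, iid_test), accuracy(memorizer, ood_test))
```

A lookup-table “model” of this kind illustrates the gap: high accuracy on the familiar distribution, near zero under the shift, whereas a system that had induced the rule would be unaffected by the change in length.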

Furthermore, the ratcheting effect crucially involves stable cultural transmission in addition to innovation. Can LLMs, like humans, not only generate novel solutions but also “lock in” these innovations by recognizing and articulating how they have advanced beyond previous solutions? Such a capability would involve more than just the generation of novel responses; it would require an understanding of the novelty of the solution and its implications, akin to human scientists who not only discover but also theorize, contextualize, and communicate their findings. The challenge for LLMs, therefore, lies not merely in generating novel solutions to problems but also in developing an ability to reflect on and communicate the nature of their innovations in a manner that contributes to the cumulative process of cultural learning. This ability would likely require some of the more advanced communicative intentions and world models (such as causal models) discussed in previous sections. While LLMs show promise in various forms of task generalization, their participation in the ratcheting process of cultural learning thus appears contingent on further advancements in these areas, which might lie beyond the reach of current architectures.


[14] For a systematic discussion of different aspects of generalization research in NLP, including different types of distribution shift, see Hupkes et al. (2023).
