Author Topic: Truth =/= knowledge  (Read 935 times)


Re: Truth =/= knowledge
« on: May 09, 2021, 02:39:26 am »
We should use this word more! Every time rightists claim that they are being "censored" by [insert platform here], we can tell them they are in fact being curated


"I discovered these old scientists had better arguments against ethno-tribalism than present-day biologists. Yet no mainstream biologists and sociologists seem to know about this"

Please present these somewhere when you have time.

I posted it here:'black'-and-'white'-identity-politics-scam/msg6246/#msg6246

What you are describing could be called informational inflation.

Yes, that's a great idea.


I forgot to include this as well! Again, quantity of knowledge is not enough to satisfy them. The quantity of distributed information (via academically published journal articles) matters more to them than the creation of the knowledge itself. And the perception/consensus that a publication contains accurate knowledge (via peer review, which isn't actually effective at ensuring accuracy) matters more than whether anyone can understand and confirm that the knowledge is accurate:

"Publish or perish" is an aphorism describing the pressure to publish academic work in order to succeed in an academic career.[1][2][3] Such institutional pressure is generally strongest at research universities.[4] Some researchers have identified the publish or perish environment as a contributing factor to the replication crisis.

Successful publications bring attention to scholars and their sponsoring institutions, which can help continued funding and their careers. In popular academic perception, scholars who publish infrequently, or who focus on activities that do not result in publications, such as instructing undergraduates, may lose ground in competition for available tenure-track positions. The pressure to publish has been cited as a cause of poor work being submitted to academic journals.[5] The value of published work is often determined by the prestige of the academic journal it is published in. Journals can be measured by their impact factor (IF), which is the average number of citations to articles published in a particular journal.[6]

To really drive this point into the ground:

In academic publishing, the least publishable unit (LPU) is the smallest measurable quantum of publication: the minimum amount of information that can be used to generate a publication in a peer-reviewed venue, such as a journal or a conference. ...The term is often used as a joking, ironic, or derogatory reference to the strategy of artificially inflating quantity of publications.

They have apparently even created new subfields dedicated solely to salivating over how much information they create:

Citation impact is a measure of how many times an academic journal article or book or author is cited by other articles, books or authors.[1][2][3][4][5] Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics,[6][7] specializing in the study of patterns of academic impact through citation analysis.

Imagine living 1,000 years ago, when you could read every single thing that was ever deemed high enough quality to merit scribes spending hundreds of hours copying it onto very expensive paper, and judge its quality and "impact" for yourself.

A communist critique of the current information addiction, which nonetheless raises some important points:

Historian Russell Jacoby, writing in the 1970s, observes that intellectual production has succumbed to the same pattern of planned obsolescence used by manufacturing enterprises to generate renewed demand for their products.

    The application of planned obsolescence to thought itself has the same merit as its application to consumer goods; the new is not only shoddier than the old, it fuels an obsolete social system that staves off its replacement by manufacturing the illusion that it is perpetually new.[6]

Jacoby laments the demise of the radical critical theory of the previous generation, which sought to understand and articulate the contradictions inherent in bourgeois and liberal democratic ideologies. The new generation of theories, in contrast, seek to allow the contradictory elements of the ideology to coexist by isolating them, assigning them to separate departments in the university. This division of intellectual labor in the service of the prevailing ideology, Jacoby says, "severs the life nerve of dialectical thought."[7]

The last paragraph is a very important point. In the ancient world, "philosophy" included the study of all subjects. A skilled philosopher was well-acquainted with every field of knowledge. At some point after the Renaissance, there was so much knowledge that not even the smartest and most dedicated scholar could possibly learn and understand everything.

Fields became specialized and compartmentalized. A scholar could no longer be an expert in every subject, though one could still learn much from each subject given the interest. Today, we have basically reached "hyper-specialization". Scientists who study one specialization within a subject cannot even understand the concepts and the real meaning of the data used in other specializations of the same subject! Frequently, when scientists present their work at an academic conference, to an audience of other scientists in their own field, no one outside their hyper-specialization understands the presentation at all.

For example, here are the subfields within physics. I doubt that an expert in one subfield would be able to understand the cutting-edge research of another subfield within their own "field".

Here's a similar list for biology. I think this would only be equivalent to the "field" classification on the physics Wikipedia page.

From the ancient world, starting with Aristotle, to the 19th century, natural philosophy was the common term for the practice of studying nature. It was in the 19th century that the concept of "science" received its modern shape with new titles emerging such as "biology" and "biologist", "physics" and "physicist" among other technical fields and titles; institutions and communities were founded, and unprecedented applications to and interactions with other aspects of society and culture occurred.[1]
The term natural philosophy preceded current usage of natural science (i.e. empirical science). Empirical science historically developed out of philosophy or, more specifically, natural philosophy. Natural philosophy was distinguished from the other precursor of modern science, natural history, in that natural philosophy involved reasoning and explanations about nature (and after Galileo, quantitative reasoning), whereas natural history was essentially qualitative and descriptive.

In the 14th and 15th centuries, natural philosophy was one of many branches of philosophy, but was not a specialized field of study. The first person appointed as a specialist in Natural Philosophy per se was Jacopo Zabarella, at the University of Padua in 1577.

Modern meanings of the terms science and scientists date only to the 19th century. Before that, science was a synonym for knowledge or study, in keeping with its Latin origin. The term gained its modern meaning when experimental science and the scientific method became a specialized branch of study apart from natural philosophy.[2]

I also came across this, which may be some food for thought in examining alternatives to the present-day approach to science: