Scientific Progress Goes ELI5
Table of Contents
GENESIS OF THE PROJECT
General Approach / What We Are Doing Here
1. This Is not a Journal
2. Keep it concise
3. Keep it I N T E R E S T I N G
4. Share this doc with people who actually know their stuff!!
5. ON DISAGREEMENTS
6. Post-1970
7. Any interesting thoughts/suggestions?
8. Suggested resources to reference and examine, etc.
Mathematics
Foundational Progress in Mathematics Since 1970
Open questions:
Formalisms (turning hard ideas into simple symbolic manipulation/algebra):
Physics
Progress:
Open questions:
Chemistry
Progress
Open questions
Biology
Progress:
Open Questions:
Evolution and origins of life (copied from Wiki)
Medicine
Progress
Open Questions
Neuroscience
Progress
Open Questions
Computer Science
Progress
Open Questions
Methodologies
Psychology
Progress
Open Questions
Methodology
Psychotherapy
Progress
Open Questions
Methodology
Economics
Progress
Open Questions
Methodology
History
Progress
Open question
Anime
Progress
Open questions:
Methodology:
Philosophy of Science
Open questions
Methodology
GENESIS OF THE PROJECT
@juliagalef Dec 1, 2021
Here's something I wish existed...
For each scientific field, a ELI5-style summary of the
- most important progress made in the last 50 years
- biggest open questions people are working on
- typical methodology used (e.g. case studies, surveys, looking for correlations in data)...
I'm also very interested in the epistemic status of each field. (The methodology question is a way of getting at this.) Like, how much can I trust a random paper in field X? How much can I trust a consensus in field X? Etc….
Whoa, people are actually doing this!
@2seema1 Dec 1, 2021:
Is there a reason you want this?
@juliagalef Dec 2, 2021
Largely just curiosity - I wish I had a better bird's eye view of science today. But also (re: the methodology item on my list) because I wish I knew how skeptical I should be of claims from each field.
@juliagalef Dec 2, 2021
Wikipedia's nice, but it lists a lot of research directions and doesn't give a clear sense of how important or solid each one is…Take neuroscience, for example. My impression is that fMRI research has been very central to neuroscience, and is of dubious quality. This seems very important, but the Wikipedia page on neuroscience barely mentions it. Here's the only mention of fMRI on the whole page:
@visakanv Dec 1, 2021
this seems like something we could make happen with a few beers and a google docs…

Kurzgesagt is a science YouTube channel with 17 million subscribers!!
General Approach / What We Are Doing Here
This Is not a Journal
- It is what it is, literally just a google docs that a buncha random internet nerds are info-dumping stuff into
- Use LLMs to expand/organize the document, but be sure to double-check their output against specific claims from reputable sources.
Keep it concise
- To stir up interest in the early stage, it’s ok to search for stuff and add stuff – try to populate everything to a moderate amount? Debatable
- Let’s not let any one section suddenly spiral into 50 pages or something, that would get weird. Try to keep things interesting
Keep it I N T E R E S T I N G
- Post genuinely interesting/compelling links.
- Remember, the point of a doc like this is not to be terminally exhaustive (which is just exhausting, ha ha) but to evoke people’s attention and interest
- Counterpoint - there is a genre of educational content, intellectual blog content, that mystifies or overlooks contradictions and complications in a field of knowledge through an (ultimately meaningless) but highly palatable or appealing range of 'wow that's so interesting'-type observations. Some specialize in this pseudothinking as an art (e.g. Malcolm Gladwell-esque). Can give the illusion of depth but ultimately very superficial or just blatantly wrong. Instead, strike a balance between critically important AND captivating, but never sacrifice true understanding for digestibility.
- “Ultimately meaningless” is hubristic to assume. “Wow that’s so interesting” is a good starting point from which people can explore further and increase their depth of understanding. If something is superficial or wrong, that can be pointed out, addressed, corrected.
Share this doc with people who actually know their stuff!!
- The more experienced eyes on this the better.
ON DISAGREEMENTS
- What’s the best practice if we see something that we disagree with especially in our fields (but not necessarily domains) of expertise?
- I would do something like, sub-bullet, “Disagree – [counter-evidence/argument]”. The disagreement itself would be interesting. This whole thing is kind of running on a trust/honor system…
- Yeah, this seems reasonable at least for now.
- Will be easy for now as we stick to “harder” fields, but as Economics and other fields are added there might need to be a process.
- Maybe we should have a discussion section lol.
- Warms my heart to see that this is actually kinda-sorta taking off. Go nerds!
Post-1970
- As a reference, anything after 1970 should be included here. Of course things don't develop instantly so imo this should only be a rough reference point.
- Yes, I know this isn’t technically 50 years. Documents are human (mostly), and require flexibility.
Any interesting thoughts/suggestions?
- It’s interesting that there are some Wikipedia pages for some of these topics – e.g. unsolved problems in Neuroscience. It might be worthwhile (do we have any Wikipedians here?) for the long run / long game to try and seed the relevant Wikipedia pages for the other fields and topics, if applicable
- Would it be better to have sub categories/fields? Since in physics alone there have been advances in astro, particle, quantum, let alone math, medicine, chemistry etc. It could make the document more readable as opposed to just info dumping in sections. A problem might arise with how niche a field is, but for now I think there are some broader fields that could be expanded upon.
- Seems reasonable
- I was thinking the same—Would like to keep contributing and create a psychotherapy subsection of psychology (10 likes and I’ll do it).
- Tip: Copy-pasting stuff does not make it ELI5
- ^ that’s fine, this is a collaborative doc. Set things up and let people improve on things
- May also be wise to “fork” a copy of the doc that’s more edited, since the emerging norm here seems to be “just keep adding stuff”
- Similar to above -- does engineering fit here? I was thinking about doing a quick section on photovoltaics.
Suggested resources to reference and examine, etc.
- Feel free to add autodidact resources to the different topics
- Would love to introduce a section above where we could show the intersection of the different fields (Ex. computational neuroscience, philosophy of science)
- Feel free to add more sections if necessary!
Mathematics
Foundational Progress in Mathematics Since 1970
The period since 1970 has witnessed remarkable progress in mathematics, characterized by the resolution of centuries-old conjectures, the development of powerful new theories, and the increasing influence of computational methods. These advancements often reveal deep and unexpected connections between different mathematical domains. Mathematics is by far the healthiest scientific discipline today, with incredible progress and breakthroughs happening every year across many different fields. Moreover, the math community seems to be very open to new ideas and new innovations as long as they are “mathematically interesting”.
- Resolution of Fermat's Last Theorem (FLT): Andrew Wiles' proof, announced in 1993 and finalized in 1994, resolved the 350-year-old Fermat's Last Theorem, which states that no positive integers a, b, c can satisfy a^n + b^n = c^n for n > 2. Wiles achieved this by proving a significant part of the Taniyama-Shimura-Weil conjecture for semistable elliptic curves, demonstrating that a hypothetical counterexample to FLT would lead to a contradiction, thereby showcasing profound links between number theory and algebraic geometry.
- Development of Public-Key Cryptography (RSA): The RSA algorithm, invented in 1977 by Rivest, Shamir, and Adleman, together with the Diffie-Hellman key exchange (Diffie, Hellman, Merkle), revolutionized secure communication. In RSA, a message’s security relies on the difficulty of factoring large integers, a problem from number theory. Public-key cryptography became a cornerstone of modern digital security for protocols like TLS/SSL and digital signatures. (A toy numerical RSA example appears after this list.)
- Progress in the Langlands Program: Initiated by Robert Langlands in the late 1960s, this vast web of conjectures proposes deep unifying connections between number theory, algebraic geometry, and representation theory. Significant advances include Andrew Wiles' proof of modularity for semistable elliptic curves (critical to resolving Fermat’s Last Theorem), Laurent Lafforgue's work establishing the Langlands correspondence for GL(n) over function fields, and Ngô Bảo Châu's proof of the Fundamental Lemma. Additionally, Vincent Lafforgue's groundbreaking 2018 work on the global parametrization of automorphic forms using shtukas significantly extended the Langlands correspondence for general reductive groups over function fields, earning him the 2019 Breakthrough Prize in Mathematics. These developments collectively confirm and extend substantial portions of Langlands' visionary framework.
- Impact of Optimal Transport (OT): Optimal Transport theory, originally formulated by Gaspard Monge in the 18th century, has undergone remarkable growth and theoretical deepening since the early 2000s. Notably, Cédric Villani's seminal contributions, which earned him the 2010 Fields Medal, rigorously connected OT to geometric analysis, partial differential equations (PDEs), and functional analysis, providing a profound theoretical foundation for the subject. Concurrently, practical innovations like the Sinkhorn algorithm accelerated OT's computational feasibility, triggering an explosion of applications across machine learning and data science. Today, OT's geometric insights—particularly via the Wasserstein distance—are fundamental in fields such as generative modeling, image processing, natural language processing, and statistical inference, reflecting a harmonious synthesis of theoretical rigor and practical utility.
- The Riemann Hypothesis (RH) continues to be one of the most significant unsolved problems in mathematics, deeply connected to the distribution of prime numbers. While it remains unproven, research is very active. For instance, work by Griffin, Ono, Rolen, and Zagier (PNAS, 2019) made progress on Jensen polynomials, an approach related to the RH, showing that many of these polynomials have real roots, which is a condition implied by the RH. More recent surveys (e.g., Preprints.org, 2024) indicate a steady stream of new contributions and attempts to prove the RH, exploring various avenues like zero-free regions, the Hilbert-Pólya conjecture, and the Keiper-Li criterion. As of early 2025, several new papers continue to explore facets of the RH, such as new expressions for the Zeta function or integral relations, though a definitive proof is still elusive. The Clay Mathematics Institute continues to list the Riemann Hypothesis as one of its Millennium Prize Problems.
- The Sum of Three Cubes – This one is some seriously ancient math. Diophantine equations are named after Diophantus of Alexandria, a 3rd-century mathematician. Two particular cases of the equation x^3 + y^3 + z^3 = k (k = 33 and k = 42) evaded mathematicians until 2019. The breakthrough was enabled by the latest tech in massively shared computing power. (A brute-force sketch for small solutions appears after this list.)
- The Collatz Conjecture – Another of math’s biggest open problems jumped closer to a resolution in 2019. The improved results posted by prolific mathematician Terence Tao rocked the math community. Even after Dr. Tao’s latest insights, the problem remains unfinished, and could still take years to solve. (A few lines of code after this list show the iteration itself.)
- Posed in 1994, the Sensitivity Conjecture became a major unresolved question in theoretical computer science. That ended in 2019, thanks to Professor Hao Huang of Emory University. In a frenzied few weeks following the initial announcement, scientists digested Dr. Huang’s proof down to a single page of brilliance.
- Mathematicians are always looking for ways to help in the fight against cancer. Recent joint work by mathematicians and biologists used innovative math modeling to guide experiments on cell growth, and related research used math models to gain new insight into how breast cancer metastasizes.
- Proof of the Poincaré Conjecture: Building on the program first started by Hamilton and using Ricci flow, Grigori Perelman sketched the path to solving the Geometrization Conjecture and, as a consequence, the Poincaré Conjecture. For this result Perelman was awarded the Millennium Prize, which he famously declined. Interestingly, Ricci flow has recently been used in the computer graphics community for the development of geometric algorithms.
- Optimal Transport (OT) has solidified its role as a powerful framework in machine learning (ML) for comparing and manipulating probability distributions, with applications continuing to expand rapidly between 2020 and 2025. Its use as a loss function (e.g., Wasserstein distance) is preferred in many scenarios like generative modeling due to its favorable geometric and topological properties over alternatives like KL divergence. Recent advancements focus on computational efficiency (e.g., Sinkhorn algorithm variations, low-rank approximations) and extensions like partial, unbalanced, Gromov, and Neural Optimal Transport. OT is now integral to supervised, unsupervised, transfer, and reinforcement learning. Specific applications include image processing and alignment, natural language processing for text similarity, style transfer in deep learning (e.g., by NVIDIA), and even in genomics for analyzing single-cell multi-omic datasets. Workshops and research initiatives (e.g., NeurIPS OTML workshops, Humboldt-Universität zu Berlin OT-DOM 2024) highlight its ongoing development and cross-fertilization with areas like dynamical systems and optimization. (A minimal Sinkhorn iteration appears after this list.)
- Theorem Proving Community: In the last couple of years the field of automated theorem proving has gained a lot of popularity, thanks to the “evangelizer” Kevin Buzzard and the Lean proof assistant. Not only have classical and modern results (like the continuum hypothesis) been formally checked, but contemporary mathematics is also being developed in this setting; of particular importance is the formalization of part of Peter Scholze’s work. Of course this builds on decades of previous work by many automated-proving communities, but since math is often driven by popularity, this is a great technical and sociological breakthrough. (A one-line Lean example appears after this list.)
- The Rise of Computational Proof, Formal Methods, and Interactive Theorem Provers: Since the late 20th century, mathematics has increasingly embraced computer-assisted proofs and formal verification methods. Landmark results like Appel and Haken's 1976 proof of the Four Color Theorem and Thomas Hales' 1998 proof of the Kepler Conjecture, later rigorously verified by the Flyspeck project (completed in 2014), demonstrated the practical necessity of computational support for complex problems. Concurrently, the emergence and adoption of interactive theorem provers such as Coq, Isabelle/HOL, and Lean have significantly enhanced the reliability and logical transparency of intricate mathematical proofs. These tools now underpin cutting-edge research, exemplified by recent formalizations of advanced concepts, including portions of Peter Scholze's groundbreaking work on condensed mathematics. Together, these developments reflect a broader trend toward deep human-computer collaboration, ensuring logical rigor in mathematics amid rising complexity.
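To make the RSA bullet above concrete, here is a hedged, toy-sized sketch in Python. The primes and message are illustrative only; real deployments use primes hundreds of digits long plus padding schemes, and nothing like this should be used for actual security.

# Toy RSA with tiny primes, purely to show the number theory at work.
p, q = 61, 53                 # two (absurdly small) secret primes
n = p * q                     # public modulus; factoring n breaks RSA
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: d*e = 1 (mod phi)

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
decrypted = pow(ciphertext, d, n)  # decrypt: c^d mod n
assert decrypted == message
print(n, ciphertext, decrypted)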
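As a toy illustration of the sum-of-three-cubes problem above, here is a minimal brute-force search in Python. It only finds solutions with small coordinates; the 2019 results for k = 33 and k = 42 required massively distributed computation and far cleverer algorithms.

# Bounded search for integer solutions of x^3 + y^3 + z^3 = k.
def icbrt(m: int) -> int:
    """Integer cube root of a non-negative integer."""
    if m == 0:
        return 0
    r = round(m ** (1 / 3))
    while r ** 3 > m:
        r -= 1
    while (r + 1) ** 3 <= m:
        r += 1
    return r

def sum_of_three_cubes(k: int, bound: int = 100):
    """Yield (x, y, z) with x^3 + y^3 + z^3 == k and |x|, |y| <= bound."""
    for x in range(-bound, bound + 1):
        for y in range(-bound, bound + 1):
            rem = k - x ** 3 - y ** 3
            z = icbrt(abs(rem)) * (1 if rem >= 0 else -1)
            if z ** 3 == rem:
                yield (x, y, z)

print(next(sum_of_three_cubes(29)))   # e.g. 29 = 3^3 + 1^3 + 1^3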
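And the Collatz iteration itself, which anyone can play with, is a few lines of Python. (Tao's 2019 result concerns "almost all" orbits attaining almost bounded values; it is not a proof of the conjecture.)

# Collatz map: halve n if even, else send n to 3n + 1. The conjecture
# says every positive integer eventually reaches 1.
def collatz_stopping_time(n: int) -> int:
    """Number of steps for n to reach 1 under the Collatz map."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_stopping_time(27))  # 27 famously takes 111 steps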
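For the Optimal Transport bullets above, here is a minimal sketch of the Sinkhorn iteration for entropically regularized OT, assuming small discrete histograms a and b and a cost matrix C. Production OT libraries add log-domain stabilization and convergence checks.

# Minimal Sinkhorn iteration between two discrete distributions.
import numpy as np

def sinkhorn(a, b, C, eps=0.1, iters=500):
    """Return an approximate OT coupling between histograms a and b."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)             # scale columns to match b
        u = a / (K @ v)               # scale rows to match a
    return u[:, None] * K * v[None, :]

a = np.array([0.5, 0.5])
b = np.array([0.25, 0.75])
C = np.array([[0.0, 1.0], [1.0, 0.0]])   # cost of moving mass
P = sinkhorn(a, b, C)
print(P, P.sum(axis=1), P.sum(axis=0))   # marginals approach a and b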
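Finally, for the flavor of interactive theorem proving: a deliberately trivial Lean 4 proof, checked by the kernel rather than a referee. The formalizations mentioned above run to tens of thousands of lines, but the workflow is the same.

-- A tiny Lean 4 proof: commutativity of addition on the naturals,
-- discharged by a lemma from the core library.
theorem sum_comm (a b : Nat) : a + b = b + a := Nat.add_comm a b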
Open questions:
- Note: Maths is full of open questions. Which questions are considered important is often determined by popularity; this differs from other subjects where practical use is the main consideration
- What's going on with the distribution of the primes (eg the Riemann hypothesis)
- Lots of work in algebraic geometry (one of the most popular areas, most recent Fields medalists work in this area)
- Question: could we consider breaking open questions out into sub-fields of mathematics and keeping the list for each subfield extremely short (1-2 points per field)? This could be a good way to break it out
- Analysis, algebra, number theory, combinatorics (or maybe set theory), geometry, topology, and category theory seem the most pertinent from the above linked list but I’m a hobbyist not a researcher so could be very wrong
- Seems like a good idea
Formalisms (turning hard ideas into simple symbolic manipulation/algebra):
“The question you raise ‘how can such a formulation lead to computations’ doesn’t bother me in the least! Throughout my whole life as a mathematician, the possibility of making explicit, elegant computations has always come out by itself, as a byproduct of a thorough conceptual understanding of what was going on. Thus I never bothered about whether what would come out would be suitable for this or that, but just tried to understand – and it always turned out that understanding was all that mattered.”
– Alexander Grothendieck (in a letter to Ronnie Brown, 12.04.1983)
- Categorical foundations for topology and combinatorics via Locale theory and Species respectively (neatly subsumes the symbolic method of generating functions)
- Jet bundles for partial differential equations; they, along with some other formalisms, connect the geometry of PDEs as surfaces to algebraic tools like Galois theory
- Cohomology and categorification of arithmetic, turning equations of numbers into relationships of spaces, providing more detail about the configurations of things
- Numerical Methods (like numerical differential equations) as formal algebra (see the Butcher group and the Connes-Kreimer Hopf Algebra)
- Algebraic Geometry, primes as points/ideals, relative geometry, GAGA connections to calculus/analysis, distinction between real (characteristic 0) and positive characteristic geometry
- To summarize much of the above: the rise of category theory as a unifying foundational framework/viewpoint for most algebraic, geometric, and topological aspects of mathematics. This has been a huge seismic shift in many areas of mathematics, and has impacted the way math is done perhaps more than any other development in the past ~century. More recently, ∞-categories have emerged, giving a "homotopical enhancement" of ordinary categories. ∞-category theory is a hot topic right now, and has shown great promise in enabling mathematicians to ask the right questions and get the right answers in many fields (e.g. algebraic geometry, homotopy theory).
Physics
Progress:
- Quantum Teleportation (← there’s still some ongoing debate about whether the experiments actually demonstrated teleportation, rather than more mundane quantum entanglement effects)
- Detection Of the X(6900) Tetraquark
- Detection of X(3872), an exotic charmonium state
- Discovery of the Higgs boson (2012), confirming the existence of the Higgs field
- Solar Neutrinos Detected
- Topological insulators
- Measuring a Moment (is “moment” really the term used, or is this a play on “magnetic moment”?) – the shortest measurement of time: how long it takes for a photon to traverse a hydrogen molecule. The final measurement was 247 zeptoseconds, or 247 trillionths of a billionth of a second. [Disagree: I don’t think this rises to the same level of fundamental importance as others on the list.]
- Detection of the accelerated expansion of the Universe
- Electroweak unification
- Direct detection of gravitational waves
- Discovery of the bottom and top quarks.
- Completion of the standard model of particle physics
- Conclusive evidence of neutrino oscillations
- Measurement of the anisotropies in the cosmic microwave background in agreement with the Lambda-CDM model of cosmology
- Rise of multi-messenger astrophysics : SN1987A and GW170817
- Is defined as the coordinated observation of data from several messengers. Light (electromagnetic radiation) is one example of a messenger, and studying light data is how much of astronomy has historically been conducted. Currently there are four messengers for studying extrasolar phenomena: electromagnetic radiation, neutrinos, gravitational waves, and cosmic rays. It is also insightful to examine how data from different messengers compares.
- The whole field of laser technology. Still ongoing and new lasing regimes are invented every couple of years
- [Comment: Can the person who wrote this be a bit more specific on the advances :) → Gladly:]
- The 1990s were dominated by research into ultrastable lasers, used for things like laser cooling and the subsequent realization of Bose-Einstein condensates.
- In the 2000s and early 2010s a lot of effort was (and to this day still is) put into the ultrashort and ultrahigh-power lasing regimes, thanks to the development of laser amplitude mode locking. Ultrashort lasers in turn gave rise to the laser frequency comb (Hänsch, Nobel Prize 2005), vastly improving the field of laser spectroscopy, which is now used (among other things) to tackle the muon’s anomalous magnetic moment problem (for example by probing muonic hydrogen).
- The ultrahigh-power regime (enabled by chirped pulse amplification and compression) promises to open another avenue to probe the fundamental structure of spacetime, by reaching electromagnetic field strengths strong enough to cause spontaneous pair production. (I’m obliged to mention laser ion acceleration, and thus also plasma wakefield acceleration, but my personal experiences in that field dampened my expectations, TTBT.)
- Also on the front of mode locking, the Fourier Domain Mode Locked (FDML) laser (Huber et al., 2006) allows the creation of wavelength sweeps of light with a continuous, everywhere-differentiable electrical field (ponder what that means for the photon statistics - fun stuff) over large bandwidths (>100 nm) at MHz repetition rates. The biggest technological hurdle is compensating dispersion in the optical media. Interestingly, without dispersion compensation the FDML becomes unstable, with the instabilities being Turing instabilities; to my knowledge the FDML is the first pure-physics system to exhibit these instabilities. https://doi.org/10.1364/OE.21.019240
- From the point of view of practical applications, laser technology is the bedrock of all the ultrahigh-bandwidth communication infrastructure we use these days (transoceanic and transcontinental cables, internet backbones, etc.). Also, without lasers we wouldn’t have superresolution microscopy. Lasers enable exciting new biomedical imaging modalities (OCT, multiphoton, fluorescence lifetime, Raman); the aforementioned FDML, and other swept laser sources, allows vast improvements in acquisition time for any imaging modality operating in the spectral domain, by enabling the use of very fast photodetectors instead of slow-to-read-out spectrometers. Then we also have the applications in cell biology, like optical tweezers. Heck, there’s a whole field called optogenomics, where the monochromaticity of CW lasers is used to turn genes (to which light-sensitive molecules have been attached) on and off on demand (don’t ask me about the biochemical details though).
- Discovery of exoplanets
- Conclusive evidence of the existence of compact objects aka black holes.
- Weak lensing?
- Quantization methods and passing between classical/quantum systems (geometric/deformation quantization)
- Mathematical foundations of classical mechanics (ongoing) through symplectic geometry, variational calculus, jet/PDE theory (this deserves an entire page)
- Advancements in numerical relativity, specifically solving the gravitational two-body inspiral problem in general relativity
- Observation of the Lense-Thirring frame dragging. (Someone who knows better should fix this if needed, iirc the solar system tests of this effect are somewhat controversial but astronomical tests of binaries have detected it consistent with GR)
- Nuclear fusion? (or does this belong to a separate field?)
- The James Webb Space Telescope (JWST) returned spectra of galaxies at z > 13, rewriting timelines for star‑formation & reionisation
[Comment: We need some condensed matter physics experts (asking a friend now) here. The whole field has been hot (ha ha!) for the last few decades but is not much reflected here] graphene, topological states in materials, quantum computation, spintronics
Open questions:
- The nature of a quantum theory of gravity
- Black hole information loss
- Actual habitation on other planets
- The neutron star equation of state, aka properties of nuclear matter at extreme pressure, gravity and temperature.
- Source of neutrino masses.
- Nature of dark matter: Weakly Interacting Massive Particles (WIMPs) have long been a leading candidate, but direct detection experiments (like LZ) have significantly constrained the available parameter space for many WIMP models. Recent efforts (2023-2025) are increasingly focusing on lighter dark matter candidates and novel detection strategies. For example, the TESSERACT experiment presented its first results in early 2025, using highly sensitive transition-edge sensors to search for low-mass dark matter (e.g., between 44 MeV/c² and 87 MeV/c²) via nuclear recoils, exploring mass ranges previously unexamined. Future phases of TESSERACT and new experiments like HeRALD (using superfluid helium) and SPICE aim to push sensitivities even lower, down to ~10 MeV/c² by the late 2020s. The hunt for dark matter continues across a broad front, including indirect detection (looking for annihilation or decay products) and accelerator-based searches, with the lack of definitive detection spurring theoretical innovation.
- Nature of dark energy (related to the cosmological constant, Einstein’s self-described “biggest blunder”). The standard ΛCDM model incorporates dark energy as a cosmological constant (Λ), but questions about its nature and origin persist. Recent results (2024-2025) from large surveys like the Dark Energy Survey (DES) have provided highly accurate measurements of cosmic expansion history using techniques like Baryon Acoustic Oscillations (BAO) and Type Ia supernovae. Some final DES dataset analyses hint at potential inconsistencies with a simple cosmological constant, suggesting dark energy might evolve over time, though these results are not yet definitive and await further cross-checking with other DES probes (like galaxy clustering and weak lensing) and upcoming surveys. On the experimental front for direct detection or interaction, a notable development in 2025 involved a research team from Nanjing University developing a magnetically levitated precision force measurement system. This system achieved a significant improvement in precision (by six orders of magnitude) for testing theories like the symmetron dark energy model, allowing exploration of previously inaccessible parameter spaces. The nature of dark energy—whether a constant, a dynamic field (quintessence), or something indicative of new laws of physics—remains a key open question.
- The discrepancy between the experimental measurement and the Standard Model (SM) theoretical prediction for the muon's anomalous magnetic moment (g-2) remains a significant point of tension and a potential window into new physics. The Fermilab Muon g-2 experiment's results, including those from 2021 and subsequent analyses of more data (Run 2-3 discussed in 2024), have consistently shown a deviation from some SM predictions. By early 2025, the situation had become more complex: while experimental measurements have reached unprecedented precision (e.g., 460 parts per billion in the 2021 Fermilab result), different theoretical approaches to calculating the SM prediction (particularly the hadronic vacuum polarization contribution) yield conflicting values. Some data-driven theoretical methods suggest a strong discrepancy with experimental results, while some newer lattice QCD calculations show better agreement with experiments, thereby confirming the Standard Model in that aspect. This ongoing 'showdown', expected to intensify with final results and further theoretical refinements in and beyond 2025, keeps the g-2 anomaly a critical area of investigation.
- The 'Hubble Tension' – the discrepancy between measurements of the Hubble constant (H0) from the early universe (e.g., Planck CMB data predicting H0≈67.4±0.5 km/s/Mpc) and measurements from the local, late-time universe (e.g., SH0ES team using Cepheid variables and Type Ia supernovae suggesting H0≈73 km/s/Mpc) – has not only persisted but arguably intensified into a 'Hubble crisis' by early 2025. New data from cosmic microwave background experiments (ACT, SPT, with the Simons Observatory coming online) and large-scale structure surveys (DESI, Euclid space telescope, and the upcoming Vera C. Rubin Observatory/LSST) are expected to be crucial in the coming years. Some 2025 findings, for instance, related to the Coma Cluster appearing closer than predicted by the standard ΛCDM model based on early universe parameters, have further deepened the mystery. While some studies using JWST data in 2024 hinted at resolutions, the broader consensus is that the tension remains robust. Proposed resolutions are varied, ranging from unknown systematic errors in one or both types of measurements to the need for new physics beyond the standard ΛCDM model, such as early dark energy, modifications to gravity, or even more exotic ideas like a slowly rotating universe. The debate is very active, with upcoming data releases highly anticipated.
- Interpretations of quantum mechanics (Copenhagen/Everettian etc.) <- (interpretations are subsumed by formalisms like quantization; people who talk about interpretations haven’t really engaged with these)
- What exactly is measurement in quantum mechanics?
- Why is gravity so much weaker than the other forces (aka the hierarchy problem)
- Fine tuning problem
- Axis of Evil / microwave background anisotropy?
/u/freckledfuck – We still don't know what the fuck gravity is
/u/datenwolf – well strictly speaking we don’t know what spacetime is; the view from the classical side of physics is that gravity is caused by mass-energy distorting (curving) spacetime. From the quantum side of physics the problem is that spacetime as a field would be required to contribute some energy by itself (zero-point energy), which, if taken naively, would instantly cause spacetime to collapse into a singularity. Also, if we attribute some energy content to spacetime itself, this would give rise to a corresponding particle (the graviton). However, trying to describe the latter using the quantum formalisms that gave us the Standard Model doesn’t work, and hence some “other” formalism is required. The most popular alternative theories (string, M, loop quantum gravity) share in common that elementary particles no longer exist as point-like objects (at scattering events), but are in one way or another spread out (vibrating strings, branes, spin networks).
What new breakthrough can we expect in our lifetime? : r/Physics
Methodology
[Comment : physics is so diverse that I am not sure it makes a lot of sense to include a common set of methodology here. What people do in condensed matter or quantum computing is so different from things in astrophysics for example] -> Maybe subdivide Physics?
- (did the LHC actually advance physics? → yes: the Higgs boson, limits on supersymmetry, QCD results I'm not familiar with. Remember negative results are important too! Well, it did “confirm” the so-far-unobserved predictions of the good old Standard Model, but AFAIK it didn’t give us one iota of a nudge in a direction from which to approach the open questions in theoretical physics.)
Chemistry
Progress
- Advances in analytical chemistry, the sub-field that uses instruments to determine and measure structures, constituent atoms, and concentrations/amounts of chemical species or materials.
- Cryo-TEM; the combination of mass spectrometry (“mass spec,” “MS”) with statistical methods to determine the structure of ever-larger molecules; in-situ and real-time videos of chemical reactions or processes at the near-atomic level
Open questions
See: Open Questions in Chemistry (nature.com)
- “many fundamental questions remain—will we ever get to the bottom of the different structures of ice, or pinpoint the origins of life on Earth, or obtain a full picture of the complexity of carbon-based molecules in space?”
- Ben Schumann and Mia Zol-Hanlon (The Francis Crick Institute, UK) describe the challenges faced by chemical glycobiologists (https://doi.org/10.1038/s42004-020-00337-6). Although all cells and most proteins are decorated with carbohydrates called glycans, our ability to characterise the structure and function of these intriguing molecules is hindered by biology and technology alike.
- Debbie Crans and Kateryna Kostenkova (Colorado State University, USA) outline open questions on the biological roles of the first-row transition metals (https://doi.org/10.1038/s42004-020-00341-w). These elements present sensitive biochemistry, and while most play important roles in biological processes and in medicine, they can become toxic at high concentrations.
- ^ several more
Methodology?
Biology
Progress:
- CRISPR
- Mapping protein-protein interactions
- Understanding protein degradation and how it can be therapeutically modified
- DeepMind/Isomorphic released AlphaFold 3 in 2024, extending accurate structure prediction to protein–nucleic‑acid/ligand complexes.
- Influence of microbiome (especially on immune system) acknowledged and being investigated
- Regulatory (not pathogen related) functions of immune system more generally
- Single Cell RNA sequencing
- Doing GWAS studies
- Not doing GWAS studies
Open Questions:
List of unsolved problems in biology - Wikipedia
Evolution and origins of life (copied from Wiki)
- Origin of life. Exactly how and when did life on Earth originate? Which, if any, of the many hypotheses is correct? What were the metabolic pathways used by the earliest life forms?
- Origins of viruses. Exactly how and when did different groups of viruses originate?
- Extraterrestrial life. Might life which does not originate from planet Earth also have developed on other planets? Might this life be intelligent?
- Evolution of sex. What selective advantages drove the development of sexual reproduction, and how did it develop?
- Development and evolution of the brain. How and why did the brain evolve? What are the molecular determinants of individual brain development?
- Origin of Eukaryotes (Symbiogenesis). How and why did cells combine to form the eukaryotic cell? Did one or more random events lead to the first eukaryotic cells, or can the formation of eukaryotic cells be explained by physical and biological principles? How did the mitochondria's mitosis cycle come in sync with its host cell? Did the mitochondria or the nucleus develop first in eukaryotes?
- Last Universal Common Ancestor. What were the characteristics of the Last Universal Common Ancestor of Archaea, Bacteria, and Eukaryotes? Did Archaea and Eukaryotes evolve out of the domain Bacteria or from a clade basal to it? Do Archaea and Eukaryotes share a later or earlier common ancestor with Bacteria?
Medicine
Progress
- Rise of Evidence-Based Medicine (EBM) and metaresearch standards - generally good practice to question / weigh any clinical research prior to 1990s with more recent literature
- Case in point: meta-analysis methods (statistics) since 1980s, systematic review methodology
- Imaging - Ultrasound, CT/MRI scanning. PET scans and other forms of functional imaging. Optical Coherence Tomography (OCT), Flow Cytometry, In-Vivo n-Photon (n >= 2) Microscopy, In-Vivo Fluorescence Lifetime Imaging (FLIM), Stainfree Histology Imaging with minimal sample preparation
- Most medical imaging progress usually is also a progress in experimental physics and engineering.
- Vaccination programs - Smallpox, polio, Hep B, Covid
- Cardiology - Statins, beta blockers, ace inhibitors
- Meaningful TB and HIV therapies
- IVF
- Birth control (ok, technically since 1960 but... still)
- mRNA vaccines
- Nerve decompressions. The first advancement in this field was the finding that spine pathologies can cause chronic pain in parts of the body far away from the spine but without any functional deficit (e.g. L1 nerve root compression can cause sciatica). While conceptually obvious that nerve compression could occur in more peripheral nerves, until laparoscopic surgery none of these peripheral nerves were surgically accessible, since the entrapment points could only be discovered intraoperatively. In the 2000s laparoscopic surgery was first used to decompress the largest peripheral nerve in the body, the sciatic nerve. Also in the 2000s the first cranial nerve decompression was developed; it is today the only known migraine treatment with a documented cure rate. This meant that previously incurable and minimally treatable forms of chronic pain could be addressed at the root cause. In 2015 the first paper was published on a laparoscopic decompression of the lumbosacral plexus, the largest combined nerve plexus in the body.
- endoscopy/laparoscopy/arthroscopy. A minimally invasive way to deliver surgical tools deep in the body as an alternative to open surgery.
- Robotic surgery
- Cancer immunotherapy
- Cancer radiotherapy advances
- Proton beam therapy, etc.
- High Throughput drug screens
Open Questions
- Approaches to widespread antibiotic resistance
- Approaches to widespread obesity, mental health issues
- Fascinating read on this from @mold_time: A Chemical Hunger – Part I: Mysteries
- GLP‑1 receptor agonists (semaglutide, tirzepatide) showed ≥15 % sustained weight‑loss in phase‑3 trials and entered wide clinical use 2022‑25.
- Psychology/psychiatry are relative unknowns compared to, say, orthopaedics.
- Inflammatory processes
- Alzheimers
- Sleep (causal mechanisms, engineering-practical ways to reduce the need to sleep)
- Ageing
- What is pain / especially chronic pain
Partial answer: Chronic pain is an umbrella term for many separate, minimally treatable diseases that share a common symptom: pain. In some cases what makes them untreatable is that the diagnostics are poor -- clinical exams and medical imaging can't identify the cause, even though in some cases the underlying cause is actually treatable with current technology. (This has been the case with nerve entrapment: poor doctor awareness, poor diagnostics, excellent surgical outcomes.) In some cases the diagnostics are ok but the cause isn't understood well enough to intervene; complex regional pain syndrome and fibromyalgia fall into that category.
- Cancer metabolism - what metabolic changes do cancer cells have in common, and how can we leverage that for better treatment?
Methodology
- Randomised controlled trials
- Cohort studies
Note - Overlap with public health and policymaking. Is tobacco control medical progress, or progress of a different field?
- Fundamentally motivated by epidemiological advances in causal inference; see Bradford Hill vs. Fisher.
Neuroscience
Progress
- Mapping the connectome
- Brain-computer interfaces (BCIs)
- 1970s (Conceptualization):
- Jacques Vidal coined the term "Brain-Computer Interface" (1973) and demonstrated first EEG-based control (using VEPs).
- Enabled by the advent of accessible computers for signal analysis.
- Shift from passive EEG observation to active control channel concept.
- 1980s (Early Non-Invasive):
- Refinement of digital EEG and signal processing.
- Development of systems based on controllable EEG signals like P300 evoked potentials (for spellers) and Slow Cortical Potentials (SCPs) (for binary choices).
- Highlighted challenges: slow speed, user training, signal reliability.
- 1990s (Rise of Invasive):
- Exploration of invasive methods (ECoG, Intracortical microelectrodes) for higher signal fidelity.
- Successful decoding of motor intentions (initially in animals, then humans) for controlling cursors/robots.
- Established the fundamental trade-off between performance (invasive) and safety/accessibility (non-invasive).
- 2000s (Human Trials & Neuroprosthetics):
- Breakthroughs in human trials using invasive BCIs.
- Demonstrated volitional control of multi-joint robotic arms by paralyzed individuals.
- Improved communication BCI speeds.
- Development of more sophisticated machine learning decoding algorithms.
- Emergence of bidirectional BCIs (incorporating sensory feedback).
- 2010s-Present (AI Integration & Commercialization):
- Integration of advanced AI/Machine Learning significantly improves decoder performance and robustness.
- Major government funding (e.g., BRAIN Initiative, DARPA projects).
- Significant commercial investment and rise of BCI startups (e.g., Synchron, Paradromics).
- Hardware advances: dry/wireless EEG, miniaturization.
- Broadening applications beyond communication/motor restoration.
Open Questions
Computer Science
Note: Many foundational concepts pre-date 1970 (like basic ideas of neural networks, garbage collection, virtual memory, etc.). This outline focuses on their significant development, widespread adoption, or major breakthroughs since 1970. There's also natural overlap between categories (e.g., a cryptographic method is both progress and a methodology).
Progress
- Hardware & Computer Architecture:
- Microprocessors & Personal Computers: The transition from mainframes to accessible personal and ubiquitous computing.
- Pipelined & Out-of-Order Execution: Significant CPU performance improvements.
- Virtual Memory: Became standard in operating systems, enabling larger applications.
- Massively Parallel Architectures: Rise of GPUs and specialized hardware (TPUs) for graphics, simulation, and AI.
- End of Moore's Law & Dennard Scaling: Slowdown in traditional scaling, driving innovation in architecture and specialization.
- Machine Virtualization: Enabled cloud computing and efficient server management.
- Networking & The Internet:
- Packet Switching & Routing: Foundations for the modern internet.
- TCP/IP Protocol Suite (v4 & v6): Enabled reliable global communication.
- Email: Early "killer app" driving network adoption.
- World Wide Web: HTTP, HTML, browsers democratized access to information.
- High-Efficiency Media Compression: DCT-based (JPEG, MPEG) and Wavelet-based (JPEG2000) methods enabling digital media distribution.
- Software-Defined Networking (SDN): Increased flexibility and programmability of networks.
- Software Systems & Languages:
- Relational Databases: Dominant model for structured data storage (SQL).
- Object-Oriented Programming: Widespread adoption (influenced by Smalltalk, implemented in C++, Java, etc.).
- Garbage Collection: Became common in mainstream languages (Java, C#, Python), improving developer productivity.
- Just-in-Time (JIT) Compilation: Performance boost for interpreted/bytecode languages (Java/JRE, .NET/CLI, WebAssembly).
- Focus on Memory Safety: Languages like Rust offering compile-time safety guarantees.
- Distributed Systems & Datastores: Development of scalable databases and Distributed Hash Tables (e.g., Kademlia).
- Cloud Computing: Delivery of computing services over the internet.
- Cryptography:
- Asymmetric (Public Key) Cryptography: Revolutionized secure communication and digital signatures (RSA, Diffie-Hellman).
- Elliptic Curve Cryptography (ECC): More efficient public key crypto.
- Zero-Knowledge Proofs: Enabling privacy-preserving verification.
- Artificial Intelligence & Machine Learning:
- Deep Learning Resurgence: Breakthroughs driven by data, computation (GPUs), and algorithmic refinements (e.g., Back-propagation's effective application).
- Computer Vision: Deep learning dominance (e.g., ImageNet 2012).
- Natural Language Processing: Transformer architecture (2017) leading to models like BERT (2018) and large scale models like GPT-3, showing emergent capabilities.
- Reinforcement Learning Successes: Solving complex games (AlphaGo, MuZero).
- Computer Graphics & Simulation:
- Physically Based Rendering (PBR): Realistic simulation of light-matter interaction (including subsurface scattering).
- Real-time Ray Tracing: Bringing high-fidelity rendering to interactive applications.
- Augmented Reality (AR) & Virtual Reality (VR): Maturing technologies combining graphics, tracking, and interaction.
- Advanced Simulation: Computational Fluid Dynamics (CFD), Soft Body Dynamics, Agent-based modeling (Flocking/Boids).
- Distributed Consensus:
- Byzantine Fault Tolerance (BFT): Algorithms for reliable distributed systems despite failures/malice.
- Nakamoto Consensus (Blockchain): Enabling decentralized trust (e.g., Bitcoin); see the toy proof-of-work sketch after this list.
- DAG-based Asynchronous Consensus: Alternative approaches to distributed agreement.
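A hedged, toy-sized sketch of the proof-of-work idea behind Nakamoto consensus, in Python. The difficulty of 16 leading zero bits is an illustrative choice that runs in about a second; Bitcoin's real difficulty is vastly higher, and a full protocol also needs block validation, the longest-chain rule, and peer-to-peer gossip.

# Toy proof-of-work: find a nonce whose SHA-256 hash of
# (block data + nonce) falls below a difficulty target.
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 16) -> int:
    """Return a nonce giving the block hash difficulty_bits leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine(b"prev_hash|transactions")
print("found nonce:", nonce)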
Open Questions
- Theoretical Computer Science:
- P vs NP: Does polynomial-time verification imply polynomial-time solution? A fundamental question with vast implications.
- Existence of Secure One-Way Functions: Is robust public-key cryptography provably secure based on fundamental complexity assumptions?
- Artificial Intelligence & Machine Learning:
- Artificial General Intelligence (AGI): Is human-level (or beyond) flexible, general intelligence possible? What are the pathways?
- Interpretability & Explainability: Why do complex models (especially deep neural networks) make specific decisions?
- Continual Learning: How can AI systems learn incrementally and adaptively without forgetting previous knowledge or requiring complete retraining?
- AI Alignment: How do we ensure complex AI systems reliably understand and act according to human values and intentions, especially as they become more autonomous and capable?
- Robustness & Adversarial Attacks: Making AI systems resistant to subtle, malicious manipulations of input data.
- Data Efficiency: Achieving strong performance without requiring massive datasets.
- Fundamental Theory of Intelligence: Developing a comprehensive, measurable definition and theory of intelligence (building on concepts like "skill-acquisition efficiency").
- Systems & Hardware:
- Beyond Moore's Law: How to continue improving computational performance through architecture, materials, and paradigms (e.g., quantum, neuromorphic).
- Scalable Quantum Computing: Building large-scale, fault-tolerant quantum computers and discovering impactful applications beyond Shor's algorithm.
- Secure and Private Computation: Developing practical methods for computation on encrypted data (homomorphic encryption, secure multi-party computation).
Methodologies
- Algorithm Design & Analysis:
- Randomized Algorithms: Using randomness to achieve efficiency or simplicity.
- Approximation Algorithms: Finding near-optimal solutions for NP-hard problems.
- Distributed Algorithms: Designing protocols for coordination in networked systems (including consensus algorithms).
- MapReduce Paradigm: Framework for processing large datasets in parallel (see the word-count sketch after the Methodologies list).
- Programming Paradigms & Software Engineering:
- Object-Oriented Design: Encapsulation, inheritance, polymorphism as organizing principles.
- Functional Programming: Emphasis on immutability and pure functions gaining traction.
- Formal Methods: Mathematically rigorous techniques for specifying, developing, and verifying software and hardware systems (e.g., TLA+, Coq, Agda, Model Checking with tools like Alloy).
- Memory Safety Paradigms: Compile-time checks (Rust's ownership/borrowing) vs. runtime checks (Garbage Collection).
- Machine Learning Techniques:
- Back-propagation: Core algorithm for training deep neural networks (see the XOR training sketch after the Methodologies list).
- Key Architectures: Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), LSTMs, Transformers.
- Ensemble Methods: Combining multiple models (e.g., Random Forests).
- Reinforcement Learning Frameworks: Q-Learning, Policy Gradients, Actor-Critic methods, Model-Based RL (World Models).
- Optimization Techniques: Stochastic Gradient Descent (SGD) and its variants (Adam, RMSprop).
- Regularization Methods: Techniques to prevent overfitting (Dropout, L1/L2 regularization).
- Biologically Inspired Approaches: Evolutionary Algorithms, Spiking Neural Networks (SNNs), concepts drawn from neuroscience.
- Introspection & Visualization: Techniques to understand internal model representations (e.g., feature visualization).
- Modeling & Simulation:
- Finite Element Method (FEM): Widely used in engineering and physics simulations.
- Monte Carlo Methods: Using random sampling for simulation and integration (see the pi-estimation sketch after the Methodologies list).
- Agent-Based Modeling: Simulating systems based on autonomous agent interactions.
- Cryptographic Foundations:
- Number Theory & Algebraic Geometry: Mathematical foundations for public key crypto (including ECC).
- Probabilistic Proof Systems: Techniques underlying Zero-Knowledge Proofs.
- Information & Coding Theory:
- Advanced Error-Correcting Codes (e.g., Turbo Codes, LDPC Codes).
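A minimal, single-machine sketch of the MapReduce paradigm from the list above, using the classic word-count example. Real frameworks distribute the map, shuffle, and reduce phases across a cluster and handle machine failures.

# MapReduce word count: map emits (word, 1) pairs, shuffle groups
# pairs by key, reduce sums the counts per word.
from collections import defaultdict

def map_phase(document: str):
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
pairs = (pair for doc in docs for pair in map_phase(doc))
print(reduce_phase(shuffle(pairs)))   # {'the': 3, 'fox': 2, ...}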
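A compact sketch of back-propagation plus gradient descent from the list above, training a tiny two-layer network on XOR in NumPy. The architecture, seed, and learning rate are arbitrary illustrative choices, and updates here are full-batch for simplicity (SGD proper samples minibatches).

# Hand-written backprop for a 2-8-1 network on XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    # backward pass (chain rule) for a squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    # gradient descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(out.round(3).ravel())   # should approach [0, 1, 1, 0]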
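And a minimal example of a Monte Carlo method from the list above: estimating pi by sampling uniform points in the unit square and counting the fraction that land inside the quarter circle.

# Monte Carlo estimate of pi; error shrinks like 1/sqrt(samples).
import random

def estimate_pi(samples: int = 1_000_000) -> float:
    inside = sum(
        1 for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4 * inside / samples

print(estimate_pi())   # ~3.14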
Psychology
Progress
- Mainly “negative” progress — we’ve learned a lot about how things DON’T work. Memory isn’t perfectly reliable, people aren’t rational, etc. This is still important progress, and early progress in any field often looks like this, ruling out general angles and overturning assumptions
- This is a nice way of framing it!
- Big Five model of personality is pretty decent
- HEXACO makes additions to the Big Five. The Big Five used only the English language in factor analysis, whereas HEXACO includes multiple languages, whereby a sixth cluster of personality traits emerges. ELI5: it is assumed that if personality traits exist, language has evolved to describe them. Using the English language yielded five traits, but when you analyze many languages, the trait of “Honesty/Humility” emerges as its own category. Article link.
- 1972 onwards — Perceptual Control Theory has an actual white box model that generates a bunch of behavior, rather than just being a bunch of statistical correlations
- This article talks about how it’s a paradigm shift
- Kuhn himself commented “Powers’ manuscript, Behavior: The Control of Perception, is among the most exciting I have read in some time. The problems are of vast importance, and not only to psychologists; the achieved synthesis is thoroughly original; and the presentation is often convincing and almost invariably suggestive. I shall be watching with interest what happens to research in the directions to which Powers points.”
- Memory Reconsolidation (See Unlocking the Emotional Brain)
Open Questions
- To a certain extent, everything. Psychology is still pre-paradigmatic
- Thomas Kuhn: “In any case, there are many fields—I shall call them proto-sciences—in which practice does generate testable conclusions but which nonetheless resemble philosophy and the arts rather than the established sciences in their developmental patterns. I think, for example, of fields like chemistry and electricity before the mid-eighteenth century, of the study of heredity and phylogeny before the mid-nineteenth, or of many of the social sciences today. In these fields, too, though they satisfy Sir Karl's demarcation criterion, incessant criticism and continual striving for a fresh start are primary forces, and need to be. No more than in philosophy and the arts, however, do they result in clear-cut progress.”
Methodology
Psychotherapy
Progress
- ELI5: Therapist effects are the influence the particular therapist has on a session, versus any other factor like the type of treatment being used. The effect a particular nurse has on your flu shot would likely be very little, but perhaps not so for psychotherapy. TLDR of the above thesis paper’s citations: 5-10% of outcome is attributable to the therapist in particular.
- The “Psyche Awards” exist. This is an award-style editorial that attempts to cull the most important advances in psychotherapy from the literature and present them in one location. Here are the 4th annual: 2021. The same as a .pdf.
- Movement away from debates of which theory is better toward theory integration and common factors (CF).
- The Dodo Bird Verdict proposes that bona fide psychotherapies are indistinguishable from one another in their effectiveness at ameliorating presenting symptoms. Because of this, factors common to all bona fide psychotherapies may be better independent variables to manipulate and study.
- Some specific theories and interventions may be more effective than others at treating specific symptoms.
- J. C. Norcross and M. R. Goldfried are prolific authors on the topic of theory integration, B. E. Wampold for Common Factors.
- “Microskills” have been identified as common among effective therapists and are used as a foundation to train beginning counselors. These include invitational skills (body language), asking questions (more open ended than closed), how to “reflect” feelings, content, and meaning, and changing problems to goals.
- The “Therapeutic Alliance” (i.e. the working relationship between counselor and client, which includes the bond, the agreement on goals, and the agreement on tasks) has been the strongest predictor (r = 0.275; Cohen’s d = 0.57) of positive therapeutic outcomes.
- Generating hypothesis, follows the methods of philosophy
- Disagree: This bullet point is unclear. What does this mean in terms of progress? What are the “methods of philosophy” and how are they used to generate a hypothesis? Certainly this is not progress made in the last 50 years.
Open Questions
- Are there actually specific factors that drive psychotherapeutic progress?
- Since Rosenzweig’s seminal 1936 paper, there has been a debate over specific versus common factors as the “active ingredient” of psychotherapy. In other words: is it CBT, DBT, or Narrative Therapy that specifically makes the difference, or factors common to all therapy that help clients (like setting goals, sensing a “good relationship” with the therapist, etc.)? If this article (2019) is not the most up-to-date summary of the topic, please update the conversation.
- What are the psychotherapeutic “Mechanisms of Change?”
- This seemingly basic question has not yet been answered. Outcome research using randomized control trials (RCTs) shows psychotherapy does make curative change when compared to placebo, but the question of how therapy makes curative change is as yet unanswered. This article (2007) is cited over 2,000 times and may be the most up-to-date summary on the topic.
- What is intelligence and how do we measure it (psychometrics)?
- Current standards like IQ are the best we’ve got, but they still don’t measure intelligence per se - maybe aspects of it
Methodology
- The field of psychotherapy has attempted to lash itself to “The Medical Model,” for good or ill. This means attempting to study and treat psychotherapy as if it were illness -> symptom -> diagnosis -> treatment.
- The field is only around 100 years old and is experiencing growing pains when it comes to attempting to utilize a scientific methodology.
- The status of Randomized Control Trials (RCTs) as best practice for psychotherapeutic outcome research has been criticized (2004).
- You can find three responses to this Westen et al. article, and the original author’s reply to those: worth a skim of all five if psychotherapy research is your jam. Based on those responses, the expert community seems to support this critique.
- Researcher Allegiance (RA) has been shown (2013) to have a significant impact on psychotherapeutic research with a moderate effect size. This threatens to invalidate many studies (as it should).
- n=very smol, very often
- A 1963 study on experimenter bias had researchers measure the outcomes of “smart” rats versus “dumb” rats. The experimenters did not know there was no difference between the groups of rats. The outcomes suggest that experimenter expectations affect the experiment; the “smart” rats performed significantly better. This has obvious implications for Researcher Allegiance in psychotherapy.
- Component Studies are used to single out therapeutic ingredients. These are done by adding one component to therapy and comparing it to the same therapy without, or by subtracting one component. P. 217 of this article (2019) has a blurb.
Economics
Good ELI5 Resources: AEA’s Journal of Economic Perspectives, Good Economics for Hard Times (Abhijit Banerjee, Esther Duflo), and Economics for the Common Good (Jean Tirole)
Use IGM’s polls to see the evolution of consensus (e.g., the consensus on the minimum wage has largely shifted)
→ [Nobel Prize awardees]
Progress
- Money printer go brr
- Anything from Robin Hanson (Anything from Robin Hanson? Really? If your 5 year old asked you about the progress in economics, your plan is redirecting them to the assembled works of Robin Hanson?)
- To the moon!
- Quantitative Easing as standard monetary policy.
- A central bank buys government bonds with newly created money, increasing liquidity in the financial system in the short term with the goal of stimulating economic activity. This increases the money supply, which tends to inflate the currency.
- Phillips Curve
- Money Multiplier
- DSGE (Dynamic Stochastic General Equilibrium) models
- Special Drawing Rights (SDR)
- Adopted by the International Monetary Fund (IMF) in 1969.
- Used as a unit of account and allocated to central banks of countries, in amounts determined by people at the IMF.
- Its value is determined by a weighted basket of five key international currencies (a toy valuation sketch follows this list).
- U.S. dollar (41.73%), euro (30.93%), renminbi (Chinese yuan) (10.92%), Japanese yen (8.33%), British pound (8.09%).
- Former Soviet countries’ transition to market-based economies
- Poland in 1990: the Balcerowicz Plan, developed with advice from Jeffrey Sachs.
- Extremely fast-paced changes (currency conversion, removing subsidies to enterprises, removing restrictions on trade).
- The influence that the US government had on the transition plans of various countries (compare Russia and Poland).
- The European Union/Euro and its effect on the unit of account for global trade (vs the US dollar)
- Oil, for example.
- Also its effect on the countries that use the Euro - lack of sovereignty over their own monetary policy. See Greece’s debt bondage to the IMF.
- Financial Derivatives going mainstream and speculation on just about anything.
- Asymmetric information leading to inefficient outcomes - Akerlof, Stiglitz, etc.
- Role of asymmetric information in insurance markets
- Role in discrimination in competitive markets
- Market for lemons
- Inflation targeting. Central banks target a rate of growth in the price level to anchor expectations of inflation. If a central bank says it will target a 2% inflation rate, and people believe it, they'll plan their lives around that target. This leads to a self-fulfilling prophecy where the existence of the target keeps inflation near the target. (A toy simulation of this anchoring mechanism follows this list.)
- While inflation targeting was first proposed by Keynes in 1923, it was first implemented by the Reserve Bank of New Zealand in 1989, then Canada in 1991, the UK in 1992, the US in 2012 (and a list too long to mention).
- Has reduced both the rate and volatility of inflation in countries that have adopted it.
- Bitcoin (don’t want to open pandora’s box but it is important)
- (as opposed to) money printed/minted by centralized authorities
- Ties into the open question of “what is money in the first place?”
- It seems quite likely (though maybe stick to actual academic developments rather than very online overconfident takes) that in retrospect BTC and ETH will be considered significant… not including them in a section about economics (or finance?) would be like leaving the Internet out of a section about media
- Related is the idea of quadratic funding/voting -- I think it is too early to determine if it (now) will have been a big thing (a decade from now)
- Would be better to look at mechanism design + social choice more generally than narrowly focusing on algorithmic mechanism design and algorithmic social choice
- Prediction markets (maybe?)
- Co-operative game theory
- Urban economics
- Micro-founded economies of agglomeration - matching, knowledge, amenities
- Solow-Swan + empirical validation by Mankiw et al.
- Romer
- Aghion / Schumpeterian growth models
- New institutional economics
- Acemoglu + Robinson
- Advances in both development econ + econ history
- Increasing popularity of empirical work:
- Chetty et al are the standard bearers
- Esther Duflo & Abhijit Banerjee
- Distinction between demand and effective demand (demand backed by the actual ability to pay, not just willingness)
- Maybe not as recent as the past 50 years: Keynes popularized the phrase “effective demand” in 1936 with “The General Theory…”
- Counterpoint - this is an introductory concept and does not seem relevant to an account of ‘progress’
- Complexity Economics
- Minimum Wage
- Refer to traditional models; Friedman primarily
- Refer to modern work by Alan Krueger, David Card, and Arin Dube for analysis of the minimum wage and its effects on employment in the context of monopsony power (possibly noting that this was predicted by traditional models, Friedman's included, but dismissed as merely a theoretical possibility(?))
- Economic Growth - Our World in Data
- RCT work and stuff; Refer to Poor Economics for good ELI5 summaries
- Poverty Traps
- Convergence, Divergence
- Industrial Policy (evolution from Friedman to Dani Rodrik)
- Amartya Sen
- Prospect Theory
- Nudge theory, with all its discontents
- Efficient Market Hypothesis
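On the SDR item above: the IMF actually fixes amounts of each currency per SDR at its periodic reviews, and the published percentage weights determine those amounts. Here is a minimal sketch of those mechanics; only the weights come from the list above, while every exchange rate and the review-date SDR value are made-up placeholders.

```python
# Toy SDR-style basket valuation. Weights are the published ones quoted above;
# every exchange rate and the review-date SDR value are placeholder assumptions.

weights = {"USD": 0.4173, "EUR": 0.3093, "CNY": 0.1092, "JPY": 0.0833, "GBP": 0.0809}
rates_at_review = {"USD": 1.00, "EUR": 1.10, "CNY": 0.15, "JPY": 0.0090, "GBP": 1.30}  # USD per unit (assumed)
sdr_usd_at_review = 1.40  # assumed USD value of one SDR when the amounts were fixed

# Fix currency amounts so the basket matches the weights at review time.
amounts = {c: weights[c] * sdr_usd_at_review / rates_at_review[c] for c in weights}

# Afterwards the SDR's dollar value floats with its member currencies.
rates_today = {"USD": 1.00, "EUR": 1.05, "CNY": 0.14, "JPY": 0.0067, "GBP": 1.25}  # assumed
sdr_usd_today = sum(amounts[c] * rates_today[c] for c in amounts)
print(f"1 SDR ≈ ${sdr_usd_today:.4f}")
```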
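And on the inflation-targeting item: a tiny simulation of the anchoring mechanism. The 2% target matches the text above; the credibility weight, starting expectation, and shock size are all assumed parameters, not estimates.

```python
# Toy expectations-anchoring simulation for inflation targeting.
# The 2% target matches the text above; everything else is an assumed parameter.
import random

random.seed(0)
TARGET = 0.02        # announced inflation target
CREDIBILITY = 0.8    # how much weight agents give the announcement (assumed)

expected = 0.06      # starting expectation, e.g. after a high-inflation episode (assumed)
for year in range(1, 9):
    # agents blend the announced target with last year's expectation
    expected = CREDIBILITY * TARGET + (1 - CREDIBILITY) * expected
    # realized inflation tracks expectations plus a small random shock (assumed)
    realized = expected + random.gauss(0, 0.005)
    print(f"year {year}: expected {expected:.2%}, realized {realized:.2%}")
```

With a credible target, expectations converge to 2% within a few years; set CREDIBILITY low and they barely move, which is the whole game of central-bank communication.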
Open Questions
- Is economics even a science? Social science?
- Can people create and test hypotheses about how money/value/interest/trade/... works? (Yes.)
- Are those tests replicable? (it’s complicated)
- What is the transmission mechanism for monetary policy and inflation?
- What is money in the first place? There are lots of ad-hoc assumptions and descriptions of what can be done with money and how it's exchanged, but still no axiomatic definition of monetary value.
- Token or battery-analog for exchanging present value/work/resource for future (expected) value/work/resources
- → barter model doesn’t satisfactorily account for devaluation by inflation.
- Entropic contributions of transactions and the exchange of goods and services.
- Ergodicity economics - Wikipedia
- Why are poor countries poor?
- How does idea generation work?
- Existence of arbitrage in cross-country interest rates (a smaller question)
- What empirically drives economies of agglomeration?
- Why did the Industrial Revolution start in Britain and Belgium?
Methodology
- Looking at charts from U.S. Bureau of Labor Statistics
- LINEAR REGRESSION BABY
- Positive vs normative statements
- Empirical studies:
- These studies are typically done in a lab using volunteer college students. They almost always offer the chance of a real reward or payout.
- Experimental economics researchers are not permitted to lie to or mislead subjects, unlike in many studies in psychology. (Something like the Asch Conformity Experiment would not be publishable.) Economists don't want to poison the well and have subjects behave differently because they suspect something fishy is happening. Of course, subjects frequently aren't told what the study is really about. But they always get an honest explanation of the tasks they'll be doing and the rewards for doing so, without fear of researcher plants messing with things.
- Straightforward replications are uncommon. To get a replication published, it usually has to be included with a variant of the original study. Some experimental economics studies may suffer the same replication issues as so many social psychology papers.
- Because they allow randomized controls, experimental studies can provide the best statistical evidence for some phenomena. But they have very limited scope. It's not clear that a demonstrated behavioral quirk of US college students would generalize to a farmer in India facing more significant gains or losses.
- Statistics applied to economics is called Econometrics, and Econometricians have contributed to statistical methods for separating correlation from causation (called "identification strategies").
- Because economists are so interested in identification strategies, OLS regression papers are not in style, unless a novel dataset has been found or the paper also contributes significant theoretical work.
- Common simple identification strategies in Econometrics include Differences-in-Differences, Regression Discontinuity, and Instrumental Variables. (A toy Differences-in-Differences sketch appears at the end of this section.)
- Papers that don’t try to address the correlation/causation problem in some way are not common in top journals.
- While identification strategies can be powerful in theory, they almost always require strong assumptions that are very contestable. Combined with the fact that economic data is very noisy, this makes many statistical claims quite weak. Worse yet, they can allow even more forking paths for researchers to hunt for statistical significance.[ak][al][am]
- Theoretical studies:
- These papers typically involve the development of a mathematical model. It may take the form of a game-theoretic model, a general equilibrium model, etc., and involve calculus or higher math.
- They frequently make use of utility functions and include large simplifying assumptions. They are often criticized on these grounds[an].
- Most of these papers are utterly inscrutable to laymen. Most people won't have to judge whether or not they can trust the paper, because they won't be able to read the paper.
- Many of the most widely cited papers are of this type, because the best of them can launch a new subfield or research program. See, for example, New Trade Theory.
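As promised above, a toy Differences-in-Differences example. The data is simulated and the “policy” and effect size are invented, so this only illustrates the arithmetic of the strategy, not any real result.

```python
# Toy Differences-in-Differences on simulated data (all parameters assumed).
import numpy as np

rng = np.random.default_rng(0)
n = 4000

treated = rng.integers(0, 2, n)    # 1 = in the "state" that changed policy (invented)
post = rng.integers(0, 2, n)       # 1 = observed after the policy change
true_effect = 2.0                  # assumed causal effect of the policy

# Outcome = baseline + fixed group difference + common time trend
#           + effect only for treated-post observations + noise.
y = (10.0 + 1.5 * treated + 3.0 * post
     + true_effect * treated * post + rng.normal(0.0, 1.0, n))

mean = {(t, p): y[(treated == t) & (post == p)].mean() for t in (0, 1) for p in (0, 1)}
# Change in the treated group minus change in the control group:
did = (mean[1, 1] - mean[1, 0]) - (mean[0, 1] - mean[0, 0])
print(f"DiD estimate: {did:.2f} (true effect: {true_effect})")
```

The subtraction nets out both the fixed group difference (1.5) and the common time trend (3.0), recovering something close to the assumed effect. The key identifying assumption, parallel trends, is baked in here by construction; in real papers it is exactly the contestable assumption discussed above.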
History
Progress
- Approaches to treating the study of history as a science have changed dramatically since the 1970s. After totalizing grand-narrative approaches to history fell out of practice in the 70s, 80s, and 90s, we've since seen a rise of popular grand-narrative approaches to human history. Many are interdisciplinary, relying as much on evolutionary anthropology as on classic historical methods.
- The Internet has fed into an explosion in consumption of 'popular' history. Podcasts, audiobooks, amateur YouTube channels, audiences for academics, larger market for pop history books etc.
- There has been an influx of broader, more diversified methods and approaches to human history, including region-specific historical studies that had not received formal treatment before the post-1970s turn toward non-Western countries, as well as cultural approaches to historical study. See: Postcolonial Studies, Feminist scholarship, etc. Whatever you think of the merit of the topics, it is undeniable that the landscape of study has changed: there is now a much wider range of scholars studying subjects no one had studied before, and asking these new questions has dramatically changed the art and science of history.
- The rise of the history of science and of science and technology studies (STS), and how those fields have reshaped philosophical approaches to scientific practice and thought.
- Arguably, this self-reflexive approach, improving scientific thought through rigorous study of how science arose historically and practically, is one of the best ways to produce better scientists, and better societies that produce better scientists.
- STS has also shed light on how the process of science is affected by broader political and social realities, and how this can shape the kind of scientific questions that scientists ask. See, famously: Thomas Kuhn, 'The Structure of Scientific Revolutions'
Open question
- To what extent is history truly 'scientific'?
- The difference between empirical and scientific approaches. It's worth recognizing that contemporary conceptions/definitions of the strict scientific process are tied to one particular perspective on what science is (Karl Popper's falsificationism, itself a reaction against positivism).
- Keeping in mind that science arose *historically* as an outgrowth of philosophy helps us remember that the philosophy of science and the scientific method aren't just natural realities to be taken for granted. There are arguments (and disagreements) about how science operates that change how we approach science even at the level of direct practice.
- Does history need to be more thoroughly scientific? Is this possible or even desirable?
Anime
Progress
- Hayao Miyazaki back outta retirement baby
- Promare
- Kanye is an Akira fan
- Making Anime Faces With StyleGAN (https://www.gwern.net/Faces)
Open questions:
- What’s the correct order of watching Fate series?
- Will HxH ever finish?
- Was the ending of Evangelion good
- Trick question, of course: which one of the three (and counting)?
- Taking bets on when Hideaki Anno will make a fourth one
- What will it take for dubbed anime to be accepted? Will it ever?
Methodology:
- Ah, i see you are a man of culture as well
- Arguing on reddit and /a/
- Posting anime memes on the broader internet
- Being a weeb 24/7
Philosophy of Science[aq]
Open questions
- Is this really a branch of science?? “Critical theory situates science within the quest for social and political rationality.”
- Popper suggested Einstein's theory of gravitation as an example of a theory that clearly satisfied the criterion of falsifiability. In contrast, he proposed that the two psycho-analytic theories of Freud and Adler were examples of theories that were non-testable and irrefutable (p.37) See: What would Karl Popper say? Are current psychological theories of ADHD falsifiable?
- What is the right balance between theory and practice? I.e., the transcendental/empirical difference.
- What is the cause of social problems? “social problems stem more from social structures and cultural assumptions than from individuals”
- To what extent is scientific truth possible?
- → Nature is the ultimate jury, judge and executioner.
- See Naturalistic Fallacy
- ← nope; the naturalistic fallacy is about “natural = better”, and that's not what this is about. This is more about Popper's falsification approach: if the science isn't sound, whatever is built on top of it will eventually be failed by nature.
- Thought experiment: The drowned child. Who’s to blame? How can we prevent drowning? Should we prevent drowning?
- ← what in the heck is this false analogy informing anything about *science*? Ethics maybe.
- Seems like an attempt to bring in a famous Effective Altruism thought exercise from Peter Singer? But of questionable relevance.
- Still missing the point. The question stated is whether there is such a thing as “scientific truth”. Most STEM scientists would say: “we can be sure of scientific falsehoods (namely, when empirical data disagrees with hypotheses and/or theory), but scientific truths are closed in on asymptotically, never fully reached.”
- To what extent are truth claims shaped by dominant ideologies?
- How can we philosophically ground our sense that things could be better in the world without falling into sloppy bias and inertia??
- Is the so-called “Great Enrichment” legit? Modernity seems great on a statistical level but “people” talk a lot about alienation and modernity sucking. How to understand this?
- Voluntarism and determinism
- Idealism and materialism
- What comes first, the chicken or the egg? (The egg) (That’s Evolutionary biology tho)
Methodology
Stanford Encyclopedia of Philosophy, entry on Critical Theory
Critical Theory (Frankfurt School)
Critical Theory: Epistemological Content and Method
Critical Theory Paradigms
Critical Theory and Methodology
https://en.wikipedia.org/wiki/List_of_unexplained_sounds#Julia
[a]Point of clarification: In 2021, it was impossible to know that the platform known as Twitter would eventually be rebranded as X.
[b]Same clarification as note [a].
[c]I've used Google Gemini extensively to revamp layouts & content in this project. Assume all of my edits are Gemini-generated. It's pretty good as of May 2025. That being said, I'm not an expert in any scientific field. It would be nice to have an expert review my edits for accuracy.
[d]Made an executive decision to include everything after 1970 to make it more clean. Yes, it's not technically 50 years, but "Scientific advances since 1970" anchors the discussion and provides a clean cutoff point, rather than debating the beginning year, as the years roll by.
[e]Should we migrate this to Github?
[f]https://mathigon.org/timeline/wiles
https://www.ox.ac.uk/news/2016-03-15-fermats-last-theorem-proof-secures-mathematics-top-prize-sir-andrew-wiles
https://staff.fnwi.uva.nl/a.l.kret/Galoistheorie/wiles.pdf
https://artscimedia.case.edu/wp-content/uploads/2013/07/14182623/Mazur-Gadfly.pdf
[g]https://ee.stanford.edu/~hellman/publications/24.pdf
[h]https://www.preprints.org/manuscript/202412.2223/v1
https://pmc.ncbi.nlm.nih.gov/articles/PMC6561272/
https://aimath.org/~kaur/publications/90.pdf
https://www.sciencenews.org/article/mathematicians-progress-riemann-hypothesis-proof
https://mathscholar.org/2019/05/mathematicians-prove-result-tied-to-the-riemann-hypothesis/
https://www.futurity.org/riemann-hypothesis-mathematics-2073592-2/
[i]Does anyone have a link? I found this, but I'm not sure if this is the intended "single page" http://www.zhengwenjie.net/huanghao/
[j]Maybe it is relevant to separate Pure Math and Applied Math sections unless the breakthrough has contributed to both pure and applied math...
[k]Source: https://www.claymath.org/millennium-problems/poincar%C3%A9-conjecture
https://cseweb.ucsd.edu/~alchern/projects/ShapeFromMetric/
[l]https://www.computer.org/csdl/journal/tp/2025/02/10740308/21uJ6a4MC3K
https://www.researchgate.net/publication/385431468_Recent_Advances_in_Optimal_Transport_for_Machine_Learning
https://www.activeloop.ai/resources/glossary/optimal-transport/
https://neurips.cc/virtual/2021/workshop/21833
https://jj-zhu.github.io/file/OT-DOM2024_BoA.pdf
https://arxiv.org/abs/2306.16156
[m]Sources: https://www.youtube.com/watch?v=Dp-mQ3HxgDE
https://xenaproject.wordpress.com/
[n]https://science.nasa.gov/mission/webb/
[o]https://www.nasa.gov/universe/nasas-webb-draws-back-curtain-on-universes-early-galaxies/?utm_source=chatgpt.com
[q]https://news.fsu.edu/news/science-technology/2025/03/20/fsu-researchers-part-of-tesseracts-hunt-for-dark-matter/
https://newscenter.lbl.gov/2025/03/05/small-but-mighty-tesseract-joins-the-hunt-for-dark-matter/
[r]https://noirlab.edu/public/news/noirlab2513/
[s]https://scitechdaily.com/gravity-defying-breakthrough-floating-sensor-unmasks-dark-energys-secrets/
[t]https://physicsworld.com/a/the-muons-magnetic-moment-exposes-a-huge-hole-in-the-standard-model-unless-it-doesnt/
https://agenda.infn.it/event/35353/contributions/239268/attachments/125774/185763/MS_240924_gm2_RICAP24-compresso.pdf
https://link.aps.org/doi/10.1103/Physics.17.6
https://www.azom.com/news.aspx?newsID=64059
[u]https://cerncourier.com/a/the-hubble-tension/
https://www.space.com/space-exploration/hubble-space-telescope/hubble-tension-is-now-in-our-cosmic-backyard-sending-cosmology-into-crisis
https://timesofindia.indiatimes.com/science/is-the-universe-rotating-this-new-groundbreaking-theory-could-help-resolve-the-hubble-tension/articleshow/120389469.cms
https://www.scientificamerican.com/article/the-hubble-tension-is-becoming-a-hubble-crisis/
[v]possibly a duplicate of the one above
[y]https://www.nature.com/articles/s41586-024-07487-w?utm_source=chatgpt.com
[z]Requesting you to recommend a video explaining the same
[aa]Should medicine be split into subfields? https://www.sgu.edu/blog/medical/ultimate-list-of-medical-specialties/
[ab]https://www.fda.gov/news-events/press-announcements/fda-approves-new-drug-treatment-chronic-weight-management-first-2014?utm_source=chatgpt.com
[ac]Cut a lot of content from this subhead, which read like a Wikipedia article, and had a lot of questions. This is great, but a lot of the content in the neuroscience section is covered in the linked WIkipedia article. Because it looks like someone did a lot of work with links, I've pasted the cut content here: https://docs.google.com/document/d/11wSzQgatyJ6JwZtzZ5r8OoVYyoL43yt1HePJUSVuWZY/edit?usp=sharing
[ad]https://www.nature.com/immersive/d42859-024-00053-4/index.html
https://www.nature.com/articles/s41586-024-07558-y
[ae]***Cut commentary at the beginning. Pasted below for posterity.***
Since computer science as a field is only ~70 years old, perhaps we should focus on really important advances or more recent advances? Without a tighter constraint than ~50 years, you could list almost all advances barring the earliest ones.
→ Well, okay, then you’ll have to cut Virtual Memory and Virtual Machines from the list ('70s stuff, albeit their mainstream use became popular only in the late 1990s (virtual memory) and mid-2000s (hypervisors)). And stuff like Pipelining and OOE, while becoming mainstream only in the mid-1990s (Pentium), also dates back to the 1970s. Heck, Register Renaming goes back even to the '60s (IIRC)
Should this include the broader concept of Software Engineering as well?
Parts of this are very close to maths (complexity reduction proofs, computability, here or maths?)
[af]wow! pls share any good sources
[ag]"The H Factor of Personality" by Kibeom Lee and Michael C. Ashton is a great book on the topic
[ah]I couldn't find an easily accessible link to the article so I uploaded my copy to drive, now linked at the end. This is some of the initial research (2004) the above mentioned book (2012) is based on; same authors.
[ai]Really like some of this paper: Section 7.4 especially.
[aj]Similar issues in medicine for new treatments where treatments exist. => Emergence of method-comparison studies https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2944826/
[al]i don't think this claim holds up very well to scrutiny. *some* identification strategies (such as IV) have strong assumptions, which can still be reasonable under the right set-up. would agree that measurement error sometimes gets less attention than desirable, but would push back against the idea of a concrete "economic data" category. economists have a lot of distinct interests and use a lot of distinct data sources
[am]more generally, this entire section is poor quality info on Economics rip
[an]Utility theory seems very well grounded to me in that most normal human preferences can be described with utility functions. The ones that can't are those that violate things like transitivity, or where an agent has a different preference ordering over options in a strict subset of the original set. Large simplifying assumptions definitely, though.
[ao]Love the fact that there's a whole section for anime too
[ap]https://x.com/juliagalef/status/1466176860480368643
[aq]***Removed this section because of an increasing emphasis on the philosophy of science as this topic developed. I've preserved the below header for posterity.***
Critical Theory
Meta: a section on social sciences needs to be elaborated first before critical theory will ever make sense in this collection.
I'd like to see a general section "Philosophy", with a subsection "Postmodernism" and "Critical Theory" being placed in a subsection of those. There’ll also be a need for a philosophical subsection “linguistic turn”. Also we’re going to need a whole section “linguistics” somewhere between cognitive sciences and math. Can we get some people working on the field of linguogenesis here? I always felt that the Sapir-Whorf hypothesis flies in the face of most approaches to tackle linguogenesis, so collecting some material on that, here, would be great – /u/datenwolf
Progress
Critical Theory desperately needs an ELI5 breakdown because too many people have hugely mistaken assumptions about what it is or isn't.
The most important matter to resolve is: what is science fundamentally, and are there domains of reality where the scientific process differs?
There's a reason why economics is murkier than physics. A full fledged answer to why gets at the point of the differences between social sciences and natural sciences, and opens the door to cultural studies and humanities.
The End of Progress: Decolonizing the Normative Foundations of Critical Theory (Columbia University Press)
There might be something interesting here about progress, i.e. by deeply exploring what we actually mean about what progress is (and who gets to decide that on a social level) we’ve learned a lot.
“Critical theory, according to Allen, is the best resource we have for achieving emancipatory social goals. In reimagining a decolonized critical theory after the end of progress, she rescues it from oblivion and gives it a future.”