
>Step 3. Memorize the exact wording of the definition.

Huh. Any mathematicians who want to share their own opinions and experiences about this?

This goes pretty much completely against my experience with other grad-school-level fields (neuroscience/ML).

You do want to be so familiar with the material that it becomes second nature, but NOT through memorization. That, at least in other areas, leads to surface-level recognition.

Does the author mean internalize and not memorize?



The author really does mean memorize. To engage with pure mathematics, you must know the definitions, since the definitions are the bedrock of the subject. If you don't know the axioms of a topology, how can you check for yourself whether something forms a topological space? Or without knowing the exact definition of continuous, how can you know whether a proof of continuity is correct? Without knowing the definitions, you can't really know mathematics.
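
For concreteness, here is one standard formulation of those axioms (stated from memory, so your textbook's exact wording may differ slightly): a topology on a set X is a collection τ of subsets of X, the "open sets", satisfying three conditions.

    % Axioms for a topology \tau on a set X:
    \emptyset \in \tau \quad\text{and}\quad X \in \tau
    % arbitrary unions of open sets are open:
    \{U_i\}_{i \in I} \subseteq \tau \;\implies\; \bigcup_{i \in I} U_i \in \tau
    % finite intersections of open sets are open:
    U, V \in \tau \;\implies\; U \cap V \in \tau

Checking whether something forms a topological space means checking exactly these conditions and nothing else.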

To be clear, this does not mean memorizing all the theorems. Getting to know the theorems (and solving problems) is what helps you internalize the subject. Math is the art of what's certain, and knowing exactly what the objects of the subject are is necessary for that. Theorems are derived from the definitions, but definitions can't be derived.

In my experience with math (undergrad and PhD), I realized I had to know the definitions to feel competent at all. In my teaching, it's hard to convince students to actually memorize any definitions, and so many times students carry around misconceptions (like that "linearly independent" just means that no vector is a scalar multiple of any other vector); if they just had the definition memorized, they might realize that the misconception doesn't hold up. Math is weird in that the definitions are actually the exact truth (by definition! tautologically so), so it does take some time to get used to the fact that they're essential.
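
To spell out why that particular misconception fails, a standard counterexample: three vectors in the plane, no one of which is a scalar multiple of another, which are nevertheless linearly dependent.

    % v_1, v_2, v_3 \in \mathbb{R}^2, pairwise not scalar multiples of each other:
    v_1 = (1, 0), \qquad v_2 = (0, 1), \qquad v_3 = (1, 1)
    % yet a nontrivial linear combination vanishes,
    v_1 + v_2 - v_3 = 0,
    % so by the actual definition the set \{v_1, v_2, v_3\} is linearly dependent.

A student who has the real definition memorized (no nontrivial linear combination equals zero) can spot this immediately; the pairwise version can't.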


> the definitions are actually the exact truth (by definition! tautologically so)

It’s easy to forget that non-math people find this — the idea that the definition is its own ‘model’ rather than an approximation of something more ‘real’ — somewhat hard to stomach. Outside of pure mathematics the idea is that mathematics is a tool for (usually lossy) modelling of reality, not a collection of already perfectly well-motivated objects to be studied in their own right.


More generally and symmetrically:

When you are studying science and technology, and the math theorem doesn't match experiment, the theory is probably wrong (or incomplete, missing factors), so you can discard it or try to improve it.

When you are studying math, and the intuition doesn't match the theorem, the intuition is probably wrong (or incomplete).


Things get a bit messier once you're doing research mathematics — definitions don't just come from nothing, and a good definition is one that serves its theorems. Definitions can be "wrong" (they might be generalizable, they might have unexpected pathological examples, etc.), and it's the result of lots of hard work by lots of mathematicians throughout history that we have the definitions we enjoy the use of today.

But yeah, while studying math, I think it's similar to learning programming — don't blame the compiler for your mistakes, it's a well-tested piece of software.


Yes. It’s slightly disingenuous of me to suggest that any definition is as valid and ‘real’ as any other. Obviously, mathematicians care about some ideas more than others.


Yeah, during department teas you can hear mutters of "interesting" as ideas are exchanged and evaluated.

But, in my last comment I was just trying to temper my previous comment's claim about how important definitions are. At some point you get so used to a definition that even if you don't know a particular formulation word for word, you could still write a textbook on the subject because you know how the theory is supposed to go.


For what little it’s worth, the thing that finally made it click for me was a series of comments on HN that were discussing musical scales.

I don’t have any musical training, but I related it back to the practice and warm up sessions we had before we’d play an actual game in the sports I played as a kid.

Perhaps some explanation like that will get it to click with someone.

I also learned of the existence of the soft-question tags on MathOverflow and Math Stack Exchange, which contain an incredible amount of guidance that I think was never possible in lectures. Sharing links to those websites in the syllabus may be helpful for the odd student who actually looks at the syllabus.


I'm teaching discrete math in January — I'll try the analogy, wish me luck!

As someone who's gone through the mathematical wringer, the analogy doesn't ring true to me, but it does still sound pedagogically useful (my students will be CS majors, so the math will be training rather than an end in itself). Even at the highest levels the definitions are of prime importance, though I suppose once you get to "stage 3" in Terry Tao's classification (see elsewhere in the thread), definitions can start to feel inevitable, since you know what the theory is about, and the definitions need to be what they are to support the theory.

Personal aside: in my own math research, something that really slowed me down was feeling like I needed everything to feel inevitable. It always bugged me reading papers that give a definition where I'm left wondering "why this definition, why not something else?", and the paper never really answers it. Now I'm wondering if my standards have just been too high, and incremental progress means being OK with unsatisfactory definitions... After all, it's what the authors managed to discover.


Yeah, sorry if I wasn't clear. I 100% agree with definitions, theorems, counterexamples, and proof techniques being incredibly important. Those are the "warm ups" or "scales" or things that need to be repeatedly drilled in my mind before trying to jump into the "game," which, to me, is solving problems.


The reasoning is that, similar to memorization of times tables, being able to recall a definition or a theorem and its context / assumptions "automatically" without needing to use your brain frees you to worry about higher order activities. Being able to apply a theorem over and over again ultimately builds mastery and internalization.

Counterintuitively, mathematicians like being "brain-off" as much as possible: you want to be able to read a phrase like "closed convex subset of a Hilbert space" and effortlessly think to yourself "oh! there's a unique norm minimizer." If you have to piece that together from scratch every time, reading papers and learning new fields becomes a dreadful slog, similar to how math in general becomes a slog for kids who don't memorize their times tables.
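
For the curious, the fact being recalled there is a standard one (stated from memory, so double-check the hypotheses before using it): every nonempty closed convex subset of a Hilbert space contains a unique element of smallest norm.

    % H a Hilbert space, C \subseteq H nonempty, closed, and convex:
    \exists!\, c^{*} \in C \ \text{ such that } \ \|c^{*}\| = \inf_{c \in C} \|c\|
    % more generally, for any x \in H there is a unique point of C closest to x
    % (the Hilbert projection theorem).

The point is that an experienced reader retrieves this in one step, rather than re-deriving it from the parallelogram law each time.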


I think they mean memorize. Most students simply lack the mathematical experience to internalize a definition correctly without simply memorizing it. They will forget an "only if", or accidentally swap two quantifiers around, or conflate two variables that need to be kept separate, etc.

This is okay! They're students on the way to gain that experience. At some point you can and will go over to internalizing, instead. But as advice to students just starting out, memorizing is the way to go.
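
As a concrete example of the quantifier-swap trap (a standard one, not tied to any particular course): moving a single "for all" past an "exists" in the epsilon-delta definition silently turns continuity into the strictly stronger uniform continuity.

    % f continuous on A \subseteq \mathbb{R} (here \delta may depend on x):
    \forall x \in A\;\; \forall \varepsilon > 0\;\; \exists \delta > 0\;\;
        \forall y \in A:\ |x - y| < \delta \implies |f(x) - f(y)| < \varepsilon
    % f uniformly continuous on A (one \delta must work for every x):
    \forall \varepsilon > 0\;\; \exists \delta > 0\;\; \forall x \in A\;\;
        \forall y \in A:\ |x - y| < \delta \implies |f(x) - f(y)| < \varepsilon

A student who has memorized the exact quantifier order won't conflate the two; a student working from a fuzzy internalized picture often will.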


The author also said to internalise it, but the simple fact is that in a proof-y math test you can't just say "well, it should converge if everything is well behaved"; the edge cases matter, and the definitions set the boundaries.


Maybe not the exact wording but the exact meaning...


Mathematical definitions are precise. If you miss one part of a definition, you cannot actually understand what it means.

You have to internalize the meaning also, but you must know the definition precisely.


It is also useful to know all alternative definitions, if any exist (they often do).



