This is false: e.g. consider a triangle with side lengths (1,1,1). However, you could say that Pythagorean philosophy was into numbers and the relation between numbers and geometrical forms, harmony, and the nature of the universe, so it would be natural to prove a, well, Pythagorean theorem. As for Pythagorean triples themselves such triangles had been known for millennia.
OK, but maybe the person in the next office over solves seven of them before breakfast every day? What about models of the COVID-19 pandemic? You can’t generalize and say differential equations are useless even if that is not what you work with. The way I interpret the OP, it would take some search through the literature to prove that something is “useless”, and of course even then there will be some niche cases, or even unexpected practical uses. I suppose there could be a bang for the buck measure where you don’t need to teach something extremely specialized to, say, high-school students or first-year university students, but I don’t think differential equations per se will ever be such an example. Some of the subtopics and sub-sub-topics within DEs, perhaps.
Well, it would make sense if it simplified solving the problem, but the way she did it meant drawing out an elaborate diagram for every single problem, drawing a series of arrows, and writing everything out twice, and if you mucked up any part of it the problem was completely wrong. It took a simple two-step operation and turned it into a five-minute ordeal of having to do everything just so. And I spent my spare time calculating orbital elements and trying to optimize spaceship designs in Traveller, so it wasn’t as if I had difficulty following complex operations, but this seemed entirely arbitrary and a complete waste of time. I was finally sent to a tutor who also couldn’t figure out what the point of all of this was but who advised me to just “go along” to get a passing grade.
It so soured me on math that I nearly gave up, and only doubled up on Advanced Algebra and Geometry the next year so I could take Physics my junior year. I was fortunate that I had the same (excellent) teacher for those classes, who encouraged and challenged me to do things in different ways, and I aced those classes. Years later, when someone introduced me to tabular integration by parts, I had a flashback to this (literally) and resisted learning it by rote until I actually understood it. And don’t get me started on how long division is taught, which is so ridiculously backward that I don’t know why it remains so codified in math education.
Are you kidding? Any time-varying phenomenon is almost certainly best represented as a differential equation, and of course you can use the Laplace transform to convert integral and differential equations into mostly algebraic or simplified integral forms, which is the fundamental basis of classical controls; similarly, any periodic or ergodic function can be decomposed via the Fourier transform into the frequency domain. Of course, if you aren’t doing heat transfer, thermofluid dynamics, or dynamic structural analysis you probably aren’t directly using differential equations, and even if you are doing complicated analysis of real-world systems you are probably using some numerical solver that discretizes the problem rather than trying to formulate a closed-form solution and solve it in the abstract, but it is still important to understand the fundamental basis of such methods so that you understand the limits of their accuracy and fidelity. I guess if you are doing civil engineering or static mechanical design you don’t really need this knowledge, and if you are a systems engineer your most important ‘skills’ are PowerPoint and Project, but calculus and differential equations underlie all of modern engineering.
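To make the “numerical solver that discretizes the problem” point concrete, here is a minimal sketch (my own toy example, not anything from the thread): forward Euler applied to a decay equation whose exact solution we know, so the discretization error is visible.

```python
import math

def euler_solve(f, y0, t_end, steps):
    """Integrate dy/dt = f(t, y) from t = 0 with the forward Euler method."""
    y, t = y0, 0.0
    dt = t_end / steps
    for _ in range(steps):
        y += dt * f(t, y)
        t += dt
    return y

# Toy problem: exponential decay dy/dt = -y with y(0) = 1.
# The exact solution is y(t) = exp(-t), so we can check the discretization.
approx = euler_solve(lambda t, y: -y, 1.0, 2.0, 10000)
exact = math.exp(-2.0)
print(approx, exact)  # the two agree to about four decimal places
```

Knowing the closed-form theory is what tells you the error here shrinks only linearly in the step size, which is exactly the “limits of accuracy and fidelity” point.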
That’s not a useless concept, though: That’s just poor teaching of a useful concept.
And in practice, differential equations have one very major use: They act as a filter for engineers and other STEM majors. A very high percentage of students who start off in a STEM major don’t have what it takes to see it through to completion, and will eventually drop out and switch to other majors. By giving them a difficult class in their very first year, you make sure those students drop out earlier rather than later, so you waste fewer resources on them.
That said, the material itself is also practical. Sure, you might not actually use the techniques much: In practice, you can usually either solve differential equations in Mathematica or the like, or you’ll be solving them numerically if Mathematica can’t handle them (and Numerical Methods is another, later class). But it takes about one course to really drive it into students’ heads just what a differential equation is, where they show up, and that they can be solved.
Is “FOIL” a useful concept, though? It seems like a rote method for doing a special case of a general operation in algebra instead of learning how to apply the general principle of the distributive property. I can understand that most students don’t actually see any fundamental value in algebra because they will not go on to use it in any other area of their personal or professional life, but if you are undercutting the essential principles just to get them to solve problems, then they aren’t really learning anything, which just exacerbates the complaint of its essential uselessness. It’s like studying Shakespeare by only reading the SparkNotes version and then complaining that it is so clichéd and predictable.
It wasn’t intended to be a personal dig at you but rather the idea that differential equations is a useless math concept just because you don’t use it. I don’t use the Krebs cycle in my work or elsewhere in my life but I recognize that it is very fundamental to understanding metabolism in organisms.
Depends what you mean by it. What you were describing earlier didn’t match anything I’d ever seen described as “the FOIL method.”
If you mean “Is it useful to be able to multiply two binomials together by multiplying their first, outer, inner, and last terms?” then I’d say yes, absolutely: it’s a basic, foundational skill for any kind of math that involves algebraic manipulations. More generally, it’s important to understand the distributive law and how to use it; and this is one fairly common application of that.
If you mean “Is FOIL a useful mnemonic device for helping people remember how to do this?” then YMMV. Like any mnemonic device, it’s useful for the people who find it helpful.
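For what it’s worth, the “first, outer, inner, last” bookkeeping is just four applications of the distributive law. A toy sketch (the function and names are mine, purely for illustration):

```python
def foil(a, b, c, d):
    """Multiply (a*x + b)(c*x + d) term by term: First, Outer, Inner, Last.
    Returns the coefficients (x^2, x^1, x^0) of the product."""
    first = a * c          # a*x times c*x
    outer = a * d          # a*x times d
    inner = b * c          # b times c*x
    last = b * d           # b times d
    return (first, outer + inner, last)

print(foil(1, 2, 1, 3))  # (x + 2)(x + 3) -> (1, 5, 6), i.e. x^2 + 5x + 6
```

The mnemonic only names the four products; the distributive law is what justifies them.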
Many of these examples point to a problem in math education: teachers often don’t know enough about real-world applications of what they’re teaching. I’m a retired engineer, and I used to volunteer at a local middle school, giving briefings on practical uses of the math subjects the kids were learning. The teachers said it was very helpful.
Honestly, I had never heard of FOIL before reading this thread and I had to google it.
The world runs on differential equations, especially PDEs.
Determinants are, AFAIK, the only way of getting to eigenvalues. And Cramer’s rule has some theoretical value, but no practical value at all. It is pretty useless.
The question about the origin of the Pythagorean theorem is interesting and brings up a question I have long wondered about. People knew a long time ago that a 3,4,5 triangle looked like a right triangle. Did it ever occur to them to wonder if it was exactly right? Or did they even have an understanding of the difference between exact and just nearly so? Once the Pythagoreans stated and proved the theorem, they knew it was exact.
To explain this, you probably all know there can be no integer-sided isosceles right triangle. But the vertex angle in a 29,29,41 triangle is about 89.966 degrees, and I doubt anyone could tell it was not right.
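If anyone wants to check that figure, the law of cosines gives the vertex angle directly (a quick sketch):

```python
import math

# Vertex angle of the isosceles triangle with legs 29, 29 and base 41,
# via the law of cosines: cos(C) = (a^2 + b^2 - c^2) / (2ab).
a, b, c = 29, 29, 41
angle = math.degrees(math.acos((a * a + b * b - c * c) / (2 * a * b)))
print(angle)  # about 89.966 degrees -- 0.034 degrees shy of a right angle
```

(29·√2 ≈ 41.012, so the base 41 is a hair too short and the angle falls just under 90°.)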
Me as well, but only because the course was strictly analytic. And the single biggest lesson I took from it was that any interesting problem has no analytic solution. I might even argue that the interestingness of a problem is equivalent to its non-analyticalness. Fractals, chaos, etc. have their roots in non-analytical solutions.
I would have appreciated a course in numerical solutions to diffEQs, maybe even as part of a combined course. Numerical stability alone is worth a course or three, but even just an intro would have been extremely useful as an undergrad.
This just isn’t true. There are plenty of phenomena that are either exactly represented as a system of ODEs (linear circuits, simple oscillatory systems) or can be readily approximated by linearizing the system about a point of interest. Essentially all modern real-time control systems use linearized differential equations, with any necessary corrections for “higher-order terms” (h.o.t.) handled by some kind of separate corrector or lookup tables. Until we had reliable real-time embedded systems, all control systems were essentially analog computers representing a linearized system (albeit with some digital features for doing actual calculations), and even now digital control systems are conceptualized as systems of differential equations that have approximately analytic solutions.
This isn’t to say that numerical methods in solving and representing systems of differential equations aren’t extremely useful but you have to understand the underlying methods of formulating and solving differential equations analytically first, just as you have to understand analytical mechanics of materials before you understand how to develop and use finite element analysis for structural analysis. The same is true for thermofluid mechanics, heat transfer, and any other area where you want to discretize or approximate a problem to solve it numerically.
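As a toy illustration of the linearization point (my own example, not from the post above): a simple pendulum obeys θ'' = −(g/L)·sin θ, and linearizing about the equilibrium gives θ'' = −(g/L)·θ. The sketch below (g/L picked arbitrarily) shows the linear model tracking the full one at small amplitude and drifting away when the neglected higher-order terms matter.

```python
import math

def pendulum_angle(theta0, dt, steps, linear=False):
    """Integrate theta'' = -(g/L)*sin(theta), or its small-angle
    linearization theta'' = -(g/L)*theta, with semi-implicit Euler."""
    g_over_L = 9.81  # an arbitrary choice for illustration
    theta, omega = theta0, 0.0
    for _ in range(steps):
        accel = -g_over_L * (theta if linear else math.sin(theta))
        omega += dt * accel
        theta += dt * omega
    return theta

# Near equilibrium the linearized model tracks the full one closely...
small = abs(pendulum_angle(0.05, 0.001, 5000)
            - pendulum_angle(0.05, 0.001, 5000, linear=True))
# ...but at large amplitude the neglected higher-order terms dominate.
large = abs(pendulum_angle(1.5, 0.001, 5000)
            - pendulum_angle(1.5, 0.001, 5000, linear=True))
print(small, large)  # small is tiny; large is not
```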
Although the development of fractals came out of discrete mathematics and geometry, analysis on fractals is the application of calculus and differential equations to describe fractals in a way that is useful for representing fractal behavior in mechanics and dynamical systems (e.g. using fractal behavior to model turbulence, or fractal growth patterns in populations). Differential equations are fundamental to describing and understanding any real-world system in a way that can be represented in terms of essential physics.
A simple pendulum can be solved analytically, but a double pendulum cannot. That’s a device with just two components! Not to mention the well-known three-body problem.
I’m not saying to spend zero time on it. But in my undergrad curriculum (i.e., the math+physics that all engineers took), we only had a class on analytic solutions, and spent zero time on numerical ones. That should have been better balanced, IMO, with the analytic part serving as an intro to the numeric one.
Some of it. Others came directly out of the study of differential equations. For instance, the Lorenz system has fractal properties.
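For anyone curious, the sensitive dependence is easy to see numerically. This is a crude forward-Euler sketch of the Lorenz system with the standard parameters (σ = 10, ρ = 28, β = 8/3); the step size and perturbation are my own choices:

```python
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (standard parameters)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def run(state, dt=0.001, steps=20000):
    for _ in range(steps):
        state = lorenz_step(state, dt)
    return state

a = run((1.0, 1.0, 1.0))
b = run((1.0, 1.0, 1.0 + 1e-6))  # perturb one coordinate by a millionth
print(a)
print(b)  # after 20 time units the two trajectories are far apart
```

A millionth of a unit of difference in the starting point is enough to put the two runs in visibly different places on the attractor, which is the chaos the post is referring to.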
I’m an engineer not well-versed in pure math, but I used Cramer’s Rule a good bit over the years. Coding it into spreadsheets and pulling them up when needed is quicker than Gaussian elimination.
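For reference, here is roughly what such a spreadsheet is doing under the hood, sketched in Python for a 3×3 system (the example system is mine):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def cramer3(A, b):
    """Solve A x = b for a 3x3 system by Cramer's rule:
    x_i = det(A with column i replaced by b) / det(A)."""
    d = det3(A)
    solution = []
    for i in range(3):
        Ai = [row[:] for row in A]       # copy A...
        for r in range(3):
            Ai[r][i] = b[r]              # ...and swap column i for b
        solution.append(det3(Ai) / d)
    return solution

# x + y + z = 6;  2y + 5z = -4;  2x + 5y - z = 27
print(cramer3([[1, 1, 1], [0, 2, 5], [2, 5, -1]], [6, -4, 27]))  # [5.0, 3.0, -2.0]
```

For fixed small sizes it really is just a handful of cell formulas, which is why it drops into a spreadsheet so easily.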
I came here to say that the overlap between folks I work with who think that algebra is useless and the ones who can’t figure out the impact of product mix on margin rates, or the impact of leveraging fixed costs on net margin, in a conceptual or repeatable way is 100%.
Every time they do it for a specific set of numbers, they have to ask me or one of the other “brainiacs” to set it up for them.
And these people are working in Corporate Finance, some of them for decades and all of them earning over $100k/year, some over $200k.
I was pretty good at math; good enough to sometimes win or come in the top few in national competitions.
Although there are many, many concepts I have never used (more outside of contests), some of which were inefficient techniques, I could not call them useless. If I had to calculate the area of a triangle, I would be unlikely to use Heron’s formula, first defining s as half the sum of the three sides and then taking the square root of some fancy expression. But it is still impressive for its time, is creative, and could potentially be useful as a cross-check.
I am long, long past the days when I would have found it interesting to prove a formula like that, but proofs are part of the “beauty” of mathematics if you take that point of view. Or the history of it, if you do not. Maybe.
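Since Heron’s formula came up: as a sketch, it is a one-liner, and easy to cross-check against the familiar half-base-times-height on a 3-4-5 right triangle (whose legs serve as base and height):

```python
import math

def heron_area(a, b, c):
    """Area of a triangle from its three side lengths via Heron's formula."""
    s = (a + b + c) / 2  # the semi-perimeter
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

# (1/2) * 3 * 4 = 6 for the 3-4-5 right triangle, matching Heron.
print(heron_area(3, 4, 5))  # 6.0
```

Its practical appeal is exactly the thread’s point: it needs only the three side lengths, no heights or angles.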
Most math most learn is hundreds of years old. Or older. Knowing almost all of the maths up to even a hundred years ago is an impressive feat. What is useless depends on what you need to use; similarities exist between many seemingly unrelated topics.
Matrix math defines everything in robotics, though I never loved it. Complex abstract calculus was strange and I never used it, but electrical engineers do. Knowing many ways to skin a cat may be overrated but again - beauty, history, perhaps. It has been decades since I have used a partial differential equation but that does not make it useless either. Despite the other thread it is easier for me to denigrate stuff which is wrong or obsolete rather than something of great practical value to someone but not me.
We had university exams (Engineering Physics) where the answer would reduce to seven equations in seven unknowns. You couldn’t really solve it further during the exam.
Personally, I just think it would be easier to teach polynomial multiplication by teaching it like arithmetic multiplication, building on something you already know how to do. That’s what they do with polynomial (long) division.
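That arithmetic-multiplication analogy is quite literal: multiplying coefficient lists is digit-by-digit long multiplication without the carrying. A minimal sketch (function name is mine):

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists, lowest degree
    first -- the same pattern as long multiplication, minus the carries."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b  # place value i+j, like aligning columns
    return out

# (x + 2) * (x + 3) = x^2 + 5x + 6
print(poly_mul([2, 1], [3, 1]))  # [6, 5, 1]
```

FOIL falls out as the special case where both lists have length two.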
Personally, I can’t think of any math concepts I found completely useless. But I admit our teacher basically skipped over solving systems of equations with matrices. We got one day on it, because the book covered it exactly once. And it seemed like just doing the same steps as when you wrote out the equations, just without writing the x, y, or z. It didn’t seem to have anything to do with what we’d learned about matrices before.
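The “same steps, just without writing the x, y, or z” observation is exactly what row reduction on the augmented matrix is. A toy sketch with a made-up 2×2 system:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting on the
    augmented matrix [A | b] -- the same steps as eliminating variables
    by hand, just without writing the variable names."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Swap in the largest remaining pivot for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution, bottom row up.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5;  x + 3y = 5
print(gauss_solve([[2, 1], [1, 3]], [5, 5]))  # [2.0, 1.0]
```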
Differential equations were also not something I used, but I found them kinda fun, at least, in Cal 1. The only part I hated was them making us memorize so many formulas instead of letting us just derive the ones that were rarely used. We didn’t really need 14 different ones.
But what my algebra ‘teacher’ was demanding was some kind of convoluted inverse of this (with the letters “F-O-I-L” listed as the vertical labels of a table) where you were supposed to take an unfactored quadratic and somehow derive the factored form, which only worked if the equation had integer roots. I recall some kind of iterative process where you assumed one root to be zero and then had to count up or down until you got some kind of integer result. It made no sense, took a big chunk of a page, and required writing out the answer twice. It felt like she’d learned some weird way of solving a small subset of quadratics and insisted that everybody use it because she didn’t know any other way of solving equations.
Though I do question the amount of time spent on some topics, often because they were of historical importance. Time may have magnified monotonous memories, but it seems we spent a lot of time on geometry. It’s not that Euclid’s work was not important. It just seems like we spent months showing that this angle had some precise relation to another one. Still, there are worse topics.