Sorry I couldn’t come up with a more intelligent title. Here’s the basic problem. Say I have two non-periodic curves of changing amplitude that have been added together to form a single curve which represents the sum. I know the shape of each curve (remember, only the amplitude is changing), and I know the sum curve at any time t.
Two theories present themselves:
The amplitudes represent a plane (two independent variables), so there is no way to look at a single summed curve and determine the two amplitudes (i.e. the set of solutions is a line).
Since the curves are definitely different and do not have overlapping peaks or troughs, you should be able to determine the amplitudes by, at worst, comparing every possible amplitude-combination of the two known curves to the known sum.
Both make sense to me, and one says “you can do it”, the other says “you can’t do it.”
So can I do it? What mathematical techniques should I investigate? Fourier analysis?
I’m not quite sure if I understand the question correctly.
Do you mean something akin to Af(t)+Bg(t) = h(t) and you know h(t) and you want to figure out A and B ?
If that’s what the OP is asking (and that’s what I got out of it), then it’s certainly possible. You just pick two values of t (call them t[sub]1[/sub] and t[sub]2[/sub]), and solve the system of linear equations
f(t[sub]1[/sub]) A + g(t[sub]1[/sub]) B = h(t[sub]1[/sub])
f(t[sub]2[/sub]) A + g(t[sub]2[/sub]) B = h(t[sub]2[/sub])
This is just two equations in two unknowns, A and B, and so should be generically soluble.
It’s possible that you’ll run into odd situations when you can’t uniquely solve for A and B; for instance, if f(t[sub]1[/sub]) = g(t[sub]1[/sub]) and f(t[sub]2[/sub]) = g(t[sub]2[/sub]), then you won’t be able to solve. But if you pick your two points such that
f(t[sub]1[/sub]) g(t[sub]2[/sub]) - f(t[sub]2[/sub]) g(t[sub]1[/sub]) ≠ 0
(i.e. the determinant of the matrix multiplying A & B is not equal to zero), then you should be able to solve in general.
Also, note that this only works if you know that h(t) is a linear combination of f(t) and g(t). A generic function h(t) can’t be decomposed in this way.
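To make that concrete, here’s a minimal Python sketch of the two-point solve MikeS describes. The function name and the example f, g, and amplitudes are all made up for illustration; any f, g, h with a nonzero determinant at the chosen points would do.

```python
import math

def solve_amplitudes(f, g, h, t1, t2):
    """Solve A*f(t) + B*g(t) = h(t) sampled at t1 and t2 (2x2 Cramer's rule)."""
    det = f(t1) * g(t2) - f(t2) * g(t1)
    if abs(det) < 1e-12:
        raise ValueError("these two points give a singular system; pick different t1, t2")
    A = (h(t1) * g(t2) - h(t2) * g(t1)) / det
    B = (f(t1) * h(t2) - f(t2) * h(t1)) / det
    return A, B

# Example: build h from known amplitudes A = 2, B = -3 so we can check the recovery.
f = math.exp
g = math.cos
h = lambda t: 2.0 * f(t) - 3.0 * g(t)
A, B = solve_amplitudes(f, g, h, 0.5, 1.5)  # A ≈ 2, B ≈ -3
```

The singularity guard is exactly the determinant condition above: if the two sample points make the rows proportional, there is no unique answer.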
If that is the problem, then I would propose the same solution as MikeS. You can even take it one step further. If you had two sums with different amplitudes, such as:
A1f(t) + B1g(t) = h1(t)
A2f(t) + B2g(t) = h2(t)
then you wouldn’t even need to know the functions f(t) and g(t) to (approximately) recreate the amplitudes A1, A2, B1 and B2.
It sounds to me like you’ve got something along the lines of A(t)f(t) + B(t)g(t) = h(t), which makes this a little tougher.
Hmm…
If you know all five functions are differentiable, you might be able to use the fact that h’(t) = A’(t)f(t) + A(t)f’(t) + B’(t)g(t) + B(t)g’(t) to get some information.
But you don’t know the general form of h(t), do you?
Something like A(t)f(t) + B(t)g(t) = h(t) can’t be solved in general, unless you know A(t) and B(t) – counter-example: A(t) = a f(t)[sup]-1[/sup], B(t) = b g(t)[sup]-1[/sup], where a and b are constants (then h(t) = a + b, a constant, no matter what f and g are).
However, something like Af(t+d) + Bg(t) = h(t), where A, d, and B are all unknown would be a challenging problem – I suppose it depends a lot on f and g.
Quite close. A and B change with respect to time, it is almost like ultrafilter described, except I’d write:
h(t) = A(t)f(x) + B(t)g(x)
The two curves, f(x) and g(x) are totally fixed, for example like looking at
f(x) = ax[sup]2[/sup] + bx + c
the entire time, over a fixed domain of x which is not time-dependent. Now suppose each such curve is multiplied by a time-dependent coefficient (this function is not necessarily linear; it’s some kind of bounded exponential growth curve of varying complexity). Take two definitely different curves of this type and sum them together, and that’s where I’m at. The x seems to add another variable to the equation, but for all intents and purposes it can be ignored, since the domain of the two x functions is always known.
If it helps, since it has been mentioned: the f and g functions are continuously differentiable. The problem is that they’d have to be created from something like Fourier analysis, or some other numerical technique (i.e. we’re creating functions from data). But one way or another, the free variable in f and g is known the entire time and is not changing with respect to time at all.
For instance, I’d have something like:
h(t) = A(t)sin(x) + B(t)cos(x)
Where t goes from 0 to n
Where I know:
h(t) at any time t[sub]i[/sub]
A(t[sub]n[/sub])sin(x)
B(t[sub]n[/sub])cos(x)
Meaning:
I end up deducing some part of the Taylor expansion for sin(x) and cos(x) from the numerical techniques, and call them f(x) and g(x). The data I’m using here is always taken at t[sub]n[/sub].
I get the data we’re calling h(t[sub]n[/sub])
I know that h(t) is the sum of f and g, and need to discover A(t[sub]n[/sub]) and B(t[sub]n[/sub]). Not the functions, A and B, but at any time t[sub]n[/sub] I need to know what contribution f and g are making to h.
whew! thanks for the thoughts so far guys, anything I can do to clear things up please let me know.
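Given that setup, recovering A(t[sub]n[/sub]) and B(t[sub]n[/sub]) at one instant is the same two-point solve, just sampled in x rather than t. A Python sketch (the function name and all numbers are invented for illustration):

```python
import math

def amplitudes_at_time(h_of_x, x1, x2):
    """Given h(x) = A*sin(x) + B*cos(x) frozen at one instant, recover A and B
    from samples at two x values. The determinant is sin(x1 - x2), so any
    x1, x2 that don't differ by a multiple of pi will work."""
    det = math.sin(x1) * math.cos(x2) - math.sin(x2) * math.cos(x1)  # = sin(x1 - x2)
    A = (h_of_x(x1) * math.cos(x2) - h_of_x(x2) * math.cos(x1)) / det
    B = (math.sin(x1) * h_of_x(x2) - math.sin(x2) * h_of_x(x1)) / det
    return A, B

# At some time t_n, suppose A(t_n) = 0.7 and B(t_n) = 1.9 (made-up values):
h_tn = lambda x: 0.7 * math.sin(x) + 1.9 * math.cos(x)
A, B = amplitudes_at_time(h_tn, 0.3, 1.2)  # ≈ (0.7, 1.9)
```

Repeating this at each t[sub]i[/sub] gives the contribution of f and g to h at every instant, without ever needing the functional forms of A and B.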
No. It just so happens that f and g are functions of x. x is not time-dependent, and we can just consider single values of x instead of the entire domain, meaning f and g are just constants.
If h(x, t) = A(t)f(x) + B(t)g(x), then the first partial derivative with respect to x is h[sub]x[/sub](x, t) = A(t)f’(x) + B(t)g’(x). That gives you a system of two equations with A(t) and B(t) as the unknowns. But again, that requires you to know the exact form of h, and that h is differentiable.
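For the sin/cos example from earlier in the thread, that derivative trick can be sketched numerically: pair h at a single x with a central-difference estimate of h[sub]x[/sub]. (The step size and all numbers here are my own illustrative choices, and I’m assuming f = sin, g = cos as in the OP’s example.)

```python
import math

def amplitudes_from_derivative(h, x, dx=1e-5):
    """Recover A, B in h(x) = A*sin(x) + B*cos(x) at a single point x,
    using h(x) plus a central-difference estimate of h'(x)."""
    hx = (h(x + dx) - h(x - dx)) / (2 * dx)  # numerical h'(x)
    # System:  A sin(x) + B cos(x) = h(x)
    #          A cos(x) - B sin(x) = h'(x)
    # Its determinant is -(sin^2 x + cos^2 x) = -1, so it is never singular.
    A = h(x) * math.sin(x) + hx * math.cos(x)
    B = h(x) * math.cos(x) - hx * math.sin(x)
    return A, B

# Check against a made-up snapshot with A = 1.3, B = 0.4 at this instant:
h = lambda x: 1.3 * math.sin(x) + 0.4 * math.cos(x)
A, B = amplitudes_from_derivative(h, 0.8)  # A ≈ 1.3, B ≈ 0.4
```

Note the nice property of this particular f, g pair: the determinant is identically -1, so the derivative system never degenerates, which isn’t true for arbitrary f and g.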
Well, no. h is a linear combination of functions of x, so it is also a function of x. You may be willing to consider single points, but that doesn’t mean that h doesn’t depend on x. And even if it doesn’t, h is a constant function of x.
Then, yeah, h is also a function of x. f and g are definitely differentiable, and A and B are, as I’ve said, some kind of bounded exponential growth which is also differentiable. As I recall, this should suggest that h is differentiable.
But your last post seems to make me think that there’s no clear way to tell how A and B have contributed to h uniquely. That’s the essential question. (Plus, where I’d look to find out how to do it.)
Actually, I can consider over a hundred values of x, yes. Also, h(x, 0) = 0, and if necessary we could know h(x, n) for all x (still taking n as the last t).
Using h(x, t) = A(t)f(x) + B(t)g(x), for any given t, A(t) and B(t) can be determined easily using h(x,t) for two values of x. If there’s some noise, so you don’t know h(x,t) exactly, you might want to average over several values of x.
That doesn’t seem that hard, so I’m wondering if what you really have is h(x, t) = A(x,t)f(x) + B(x,t)g(x). In that case, A(x,t) and B(x,t) are underdetermined. You’d need some other condition to define them. Possibly, you’d want something like A(x,t) and B(x,t) are slowly varying with respect to x.
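On the “average over several values of x” point: the standard way to use many x samples at once is a linear least-squares fit, which for two unknowns reduces to a 2x2 normal-equation solve. A sketch, again assuming the sin/cos example, with made-up amplitudes and noise level:

```python
import math
import random

def fit_amplitudes(xs, hs, f=math.sin, g=math.cos):
    """Least-squares fit of A, B in h(x) ≈ A f(x) + B g(x) over many samples,
    via the 2x2 normal equations. Robust to moderate noise in hs."""
    Sff = sum(f(x) ** 2 for x in xs)
    Sgg = sum(g(x) ** 2 for x in xs)
    Sfg = sum(f(x) * g(x) for x in xs)
    Sfh = sum(f(x) * h for x, h in zip(xs, hs))
    Sgh = sum(g(x) * h for x, h in zip(xs, hs))
    det = Sff * Sgg - Sfg ** 2
    A = (Sfh * Sgg - Sfg * Sgh) / det
    B = (Sff * Sgh - Sfg * Sfh) / det
    return A, B

# Noisy samples of h(x) = 2.5 sin(x) - 1.0 cos(x) frozen at one instant:
random.seed(0)
xs = [i * 0.05 for i in range(100)]
hs = [2.5 * math.sin(x) - 1.0 * math.cos(x) + random.gauss(0, 0.01) for x in xs]
A, B = fit_amplitudes(xs, hs)  # ≈ (2.5, -1.0)
```

With a hundred x values available, this averages out the noise far better than picking any two points would.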
A and B are definitely only dependent on t and have nothing to do with x. The x functions are intrinsic physical properties. Noise is already covered on the f and g functions.
Is there a name for the mathematics I’d need to use to figure out how to do this, or should I just crack open my calculus books and start looking?