In almost any imaginable theory, an experiment involves the preparation of a system in what we call a “state” and the subsequent observation of various quantities which we call “observables”. The observable quantities might be lengths, colours, states of matter, relative phases of signals, or many other things, but they can almost always be reduced to real numbers, or even just to sequences of yes/no questions. The prepared states typically correspond to the specification of particular observable values, achieved by filtering out whatever doesn’t meet the required criteria.

States which are specified as completely as possible are called “pure states”, but it is also possible to prepare a “mixed state”, in which the specification is incomplete, so that the values of some quantities which could have been specified more precisely (without interfering with others already specified) are given only by a statistical distribution. In quantum theories the complete specification of some observables may make it impossible to also have a complete specification of others, so even in pure states the values of some observables may not be completely determined. In such cases the state cannot be written as a statistical mixture of states in which the problem variable is completely determined (the eigenstates of that observable), but only as a linear combination (aka superposition) of the corresponding wave functions.
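To make the distinction between a superposition and a statistical mixture concrete, here is a minimal numerical sketch for a two-level system (the choice of a “qubit”, the observables called Z and X, and the numpy representation are all just illustrative assumptions, not anything specific to the argument above):

```python
import numpy as np

# Eigenstates of a toy observable "Z" on a two-level system.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Pure state: an equal superposition of the two Z-eigenstates.
psi = (ket0 + ket1) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Mixed state: a 50/50 statistical mixture of the same two eigenstates.
rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# Both give the same probabilities for a Z measurement...
print(np.round(np.diag(rho_pure), 3))   # [0.5 0.5]
print(np.round(np.diag(rho_mixed), 3))  # [0.5 0.5]

# ...but the superposition keeps off-diagonal "coherence" terms, so it is
# not a mixture of Z-eigenstates: a measurement in the "X" basis tells
# the two apart.
x_plus = (ket0 + ket1) / np.sqrt(2)
print(np.round(x_plus @ rho_pure @ x_plus, 3))   # 1.0 - the superposition is an X-eigenstate
print(np.round(x_plus @ rho_mixed @ x_plus, 3))  # 0.5 - the mixture stays 50/50
```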
The measurement process has several stages, and a lot of confusion about what is meant by an “observer” in QM arises from not keeping them separate.
Two of these stages are often identified with the notion of “collapse”.
One is where a small part of the world which appears to be in a pure state (which, for any particular observable, may not be an eigenstate but just a linear superposition of the eigenstates corresponding to different values of that observable) interacts with some part of the external world (which is not in a pure state) in such a way that the small part ends up appearing to be in a classical statistical mixture of eigenstates. After this interaction is complete the observer still may not know which eigenvalue applies (i.e. what the observed value of the measurement will be), but the situation will be no more (nor less!) mysterious than that of a coin toss which has not yet been looked at. This first stage of collapse has been understood in principle since von Neumann, though in the last few decades more detailed specific models of it have come to be touted as “decoherence”; the part of the world “causing” the collapse could be anything from the surrounding thermal radiation to an actual measurement instrument, a cat, or another human observer (cf “Wigner’s friend”) who learns the truth before you do.
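A toy version of this first stage can be written down directly. This is only a sketch under strong assumptions: the “environment” is a single extra qubit, and the interaction that copies the system’s value into it stands in for whatever complicated coupling actually does the job in a real measurement:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The system starts in a pure superposition of the two eigenstates.
psi_sys = (ket0 + ket1) / np.sqrt(2)

# A measurement-like interaction that correlates the environment with the
# system: |0>|0> -> |0>|0>, |1>|0> -> |1>|1>.  Here the resulting joint
# state is written down directly rather than derived from a Hamiltonian.
joint = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho_joint = np.outer(joint, joint.conj())

# What the system looks like on its own: trace out the environment.
rho_sys = np.einsum('ikjk->ij', rho_joint.reshape(2, 2, 2, 2))

print(np.round(rho_sys, 3))
# [[0.5 0. ]
#  [0.  0.5]]
# The off-diagonal coherences are gone: locally the system now looks like a
# classical 50/50 mixture of eigenstates, even though the joint state of
# system plus environment is still pure.
```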
But if you are the observer we are interested in, then as far as you are concerned the system remains in a mixed state until you become aware of the result, and the collapse of that classical probability distribution happens only in your mind.
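In code this last step is nothing but ordinary conditioning on new information; there is no physics in it (the numbers below just continue the 50/50 example from the sketches above):

```python
import numpy as np

# After decoherence, your description of the system is just a classical
# probability distribution over the possible outcomes.
p = np.array([0.5, 0.5])          # P(outcome 0), P(outcome 1)

# "Collapse" on learning the result is ordinary conditioning, exactly as
# for a coin that has already landed but not yet been looked at.
observed = 1                      # you become aware that the outcome is 1
posterior = np.zeros_like(p)
posterior[observed] = 1.0

print(p)          # [0.5 0.5]  before you look
print(posterior)  # [0. 1.]    after you look - an update of knowledge, not of the system
```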
So, tl;dr, there is no “collapse of the wave”. What there is is, first, decoherence of the wave (which can be caused by interaction with almost anything that is even slightly complicated), and then, later, collapse of the resulting probability distribution, which is where you and your “consciousness” come in.