Correlations in many-body quantum states give rise to interesting phenomena while at the same time being enormously difficult to describe theoretically. Recent experiments with ultracold atoms in optical lattices make it possible to explore quantum many-body correlations in a controlled and tunable way. The underlying theoretical model is the single-band Hubbard model. In this thesis we investigate the one-dimensional Hubbard model by applying the newly developed time-dependent two-particle reduced density matrix method. The method rests on the assumption that three-particle correlations are negligible. We assess this approximation for both equilibrium and non-equilibrium states and show that the method is well suited for weakly interacting, weakly correlated systems.