In recent years, low-cost, high-frame-rate 3D (range) cameras, which simultaneously provide distance and intensity information, have become commercially available. These cameras have attracted considerable interest in numerous applications; indoor mapping and autonomous mobile navigation are two examples with great potential for such sensors. Motion estimation is an integral part of these applications. It is therefore vital to investigate methods and techniques for motion estimation that exploit the simultaneous availability of range and intensity information provided by these 3D cameras. This thesis investigates the integration of range and intensity data for the task of motion estimation, covering both the motion of a moving camera and the motion of independently moving objects. The integration is realized using the range flow and optical flow constraints, which have been used for motion estimation in range and intensity images, respectively. Range flow and optical flow lead to similar mathematical formulations and can therefore be integrated well into a single estimation problem.

Matching distinctive features in images helps to identify loop closures and revisits of an area, which is essential for obtaining a globally consistent trajectory. In indoor environments, however, features may be sparse, and visually similar surroundings can make robust feature matching very challenging. The solution proposed in this thesis therefore utilizes the estimated relative orientations in a bundle adjustment: even when the feature points are few in number and not well distributed across the image, the orientation can still be estimated accurately by using information from the relative orientation. The proposed algorithm is evaluated on a publicly available dataset and benchmark, where it performs well in comparison to a state-of-the-art algorithm.
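The two flow constraints mentioned above are commonly written as follows; this is a sketch using standard notation from the optical flow and range flow literature, not necessarily the thesis's own symbols:

```latex
% Optical flow constraint (brightness constancy), with image intensity g(x,y,t)
% and unknown image-plane velocities (u, v):
g_x u + g_y v + g_t = 0
% Range flow constraint, with range Z(x,y,t) and unknown 3D velocities (U, V, W):
Z_x U + Z_y V - W + Z_t = 0
```

Both constraints are linear in the unknown velocities, which is why they can be stacked into one (weighted) least-squares estimation problem.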
Furthermore, using variance component analysis in the bundle adjustment, it is shown that the original accuracy estimates of the relative orientation are far too optimistic.

This thesis also presents a method for dense 3D motion estimation of independently moving objects observed with a static camera, likewise based on the integration of the range flow and optical flow constraints. The method proceeds in two steps: first, the motion is estimated locally; second, a global regularization is performed, which yields smooth, dense flow vectors. The advantage of this approach is that it leads to a linear equation system, which is solved iteratively to remove outliers. Finally, an example of motion estimation on a landslide is presented. Here the motion estimation is realized using the range flow constraint, applied to raster-based digital surface models generated from multi-temporal laser scanning data of the landslide surface. The thesis demonstrates the feasibility and benefits of integrating range and intensity data, of combining global and local models, and of considering the stochastic properties of the measurements in the parameter estimation.
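The local-then-global scheme described above can be sketched in the spirit of Lucas-Kanade-style local estimation followed by Horn-Schunck-style global regularization; the neighborhood $N(p)$, the weight $\alpha$, and the use of the range flow term alone as the data term are illustrative assumptions, not the thesis's exact formulation:

```latex
% Step 1 (local): estimate the flow at pixel p by least squares over a neighborhood
\min_{U,V,W}\; \sum_{q \in N(p)} \left( Z_x U + Z_y V - W + Z_t \right)^2

% Step 2 (global): regularize toward a smooth dense flow field
\min_{U,V,W}\; \int \left( Z_x U + Z_y V - W + Z_t \right)^2
  + \alpha \left( \|\nabla U\|^2 + \|\nabla V\|^2 + \|\nabla W\|^2 \right) \, dx \, dy
```

Setting the variational derivatives of the global functional to zero yields a sparse linear equation system, which can be solved iteratively while downweighting residuals to suppress outliers.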