One of the most important functions of smartphones and tablets is capturing, storing, and viewing images and videos. Never before has it been so easy and effortless to take a picture. As a result, image collections on mobile devices are growing rapidly. However, current browsing interfaces are not designed to help users handle large collections efficiently: it is hard to find an image again when the user does not know exactly where it was filed, and users end up scrolling endlessly. The same holds for video collections, and even for navigation within a single video, where discovering important scenes without knowing the exact time code is difficult. Improving this situation is the focus of this thesis. In a first step, the status quo of mobile media usage on smartphones and tablets is evaluated by means of a survey with 215 participants. The implications of this survey then inform the development of a variety of mobile 2D and 3D image and video browsing interfaces. These approaches build on the results of content analysis: for images, the dominant color is used for sorting, while videos are segmented with the help of a new sub-shot-based approach. Three user studies were performed, two on image browsing and one on video browsing. They reveal an important difference between browsing on small versus large touchscreens and demonstrate the utility of sub-shot visualization in a mobile setting. Furthermore, additional mobile video browsing interfaces are introduced. Finally, since content analysis is used extensively, the performance of current smartphones and tablets on well-known OpenCV functions is measured, listed, and discussed.