Doepke

Frank Doepke, San Jose, CA US

Patent application number - Description - Published
20110249756 - Skin Tone and Feature Detection for Video Conferencing Compression - In many videoconferencing applications, bandwidth is at a premium, and thus, it is important to encode a given video frame intelligently. It is often desirable that a larger amount of information be spent encoding the more important parts of the video frame, e.g., human facial features, whereas the less important parts of the video frame can be compressed at higher rates. Thus, there is a need for an apparatus, computer readable medium, processor, and method for intelligent skin tone and facial feature aware videoconferencing compression that can “suggest” intelligent macroblock compression ratios to a video encoder. The suggestion of compression rates can be based at least in part on a determination of which macroblocks in a given video frame are likely to contain skin tones, likely to contain features (e.g., edges), likely to contain features in or near skin tone regions, or likely to contain neither skin tones nor features. Published 10-13-2011. (See the sketch after this table.)
20120036433 - Three Dimensional User Interface Effects on a Display by Using Properties of Motion - The techniques disclosed herein use a compass, MEMS accelerometer, GPS module, and MEMS gyrometer to infer a frame of reference for a hand-held device. This can provide a true Frenet frame, i.e., X- and Y-vectors for the display, and also a Z-vector that points perpendicularly to the display. In fact, with various inertial clues from the accelerometer, gyrometer, and other instruments that report their states in real time, it is possible to track the Frenet frame of the device in real time to provide a continuous 3D frame of reference. Once this continuous frame of reference is known, the position of a user's eyes may either be inferred or calculated directly by using a device's front-facing camera. With the position of the user's eyes and a continuous 3D frame of reference for the display, more realistic virtual 3D depictions of the objects on the device's display may be created and interacted with by the user. Published 02-09-2012.
20120081579 - High Dynamic Range Transition - In personal electronic devices that include digital imaging capability, methods, devices, and computer readable media are described for determining when image capture operations may benefit from using high dynamic range imaging (HDRI) operations. In general, techniques are disclosed for analyzing an image's luminosity and/or color/tonal histograms to automatically determine when HDRI operations can benefit scene capture. If it is determined that HDRI operations can improve scene capture, the user is notified accordingly. Published 04-05-2012.
20120105672 - Auto Exposure Blowout Prevention - Systems, methods, and a computer readable medium for performing an improved blowout prevention process in an image capture device are provided to compensate for occurrences of exposure “blowouts,” i.e., areas in a captured image where pixel brightness exceeds the sensor's dynamic range. In one embodiment, the captured image's histogram may be analyzed to determine if the image is indicative of the presence of exposure blowouts. Once it has been determined that there likely are blowouts in the image, an exposure bias for the image capture device may be set accordingly. In particular, the exposure value (EV) for the image capture device may be gradually corrected, e.g., by one-eighth of a stop per captured frame, until the image histogram is no longer indicative of blown-out regions, at which point the image capture device's exposure value may gradually be corrected back to “normal,” i.e., non-exposure-bias-compensated, levels. Published 05-03-2012. (See the sketch after this table.)
20120293607 - Panorama Processing - This disclosure pertains to devices, methods, and computer readable media for performing panoramic photography processing techniques in handheld personal electronic devices. A few generalized steps may be used to carry out the panoramic photography processing techniques described herein: 1) acquiring image data from the electronic device's image sensor's image stream; 2) displaying a scaled preview version of the image data in real-time on the device's display; 3) performing “motion filtering” on the acquired image data; 4) generating full-resolution and lower-resolution versions of portions of the images that are not filtered out by the “motion filtering” process; 5) substantially simultaneously “stitching” both the full-resolution and lower-resolution image portions together to create the panoramic scene; and 6) substantially simultaneously sending the stitched version of the lower-resolution image portions to a preview region on the device's display and storing the stitched version of the full-resolution image portions to a memory. Published 11-22-2012.
20120293608 - Positional Sensor-Assisted Perspective Correction for Panoramic Photography - This disclosure pertains to devices, methods, and computer readable media for performing positional sensor-assisted panoramic photography techniques in handheld personal electronic devices. Generalized steps that may be used to carry out the panoramic photography techniques described herein include, but are not necessarily limited to: 1) acquiring image data from the electronic device's image sensor; 2) performing “motion filtering” on the acquired image data, e.g., using information returned from positional sensors of the electronic device to inform the processing of the image data; 3) performing image registration between adjacent captured images; 4) performing geometric corrections on captured image data, e.g., due to perspective changes and/or camera rotation about a non-center of perspective (COP) camera point; and 5) “stitching” the captured images together to create the panoramic scene, e.g., blending the image data in the overlap area between adjacent captured images. The resultant stitched panoramic image may be cropped before final storage. Published 11-22-2012. (A sketch of the sensor-based motion-filtering step appears after this table.)
20120293609 - Positional Sensor-Assisted Motion Filtering for Panoramic Photography - This disclosure pertains to devices, methods, and computer readable media for performing positional sensor-assisted panoramic photography techniques in handheld personal electronic devices. Generalized steps that may be used to carry out the panoramic photography techniques described herein include, but are not necessarily limited to: 1) acquiring image data from the electronic device's image sensor; 2) performing “motion filtering” on the acquired image data, e.g., using information returned from positional sensors of the electronic device to inform the processing of the image data; 3) performing image registration between adjacent captured images; 4) performing geometric corrections on captured image data, e.g., due to perspective changes and/or camera rotation about a non-center of perspective (COP) camera point; and 5) “stitching” the captured images together to create the panoramic scene, e.g., blending the image data in the overlap area between adjacent captured images. The resultant stitched panoramic image may be cropped before final storage. Published 11-22-2012.
20120293610 - Intelligent Image Blending for Panoramic Photography - This disclosure pertains to devices, methods, and computer readable media for performing positional sensor-assisted panoramic photography techniques in handheld personal electronic devices. Generalized steps that may be used to carry out the panoramic photography techniques described herein include, but are not necessarily limited to: 1) acquiring image data from the electronic device's image sensor; 2) performing “motion filtering” on the acquired image data, e.g., using information returned from positional sensors of the electronic device to inform the processing of the image data; 3) performing image registration between adjacent captured images; 4) performing geometric corrections on captured image data, e.g., due to perspective changes and/or camera rotation about a non-center of perspective (COP) camera point; and 5) “stitching” the captured images together to create the panoramic scene, e.g., blending the image data in the overlap area between adjacent captured images. The resultant stitched panoramic image may be cropped before final storage. Published 11-22-2012.
20120294549 - Positional Sensor-Assisted Image Registration for Panoramic Photography - This disclosure pertains to devices, methods, and computer readable media for performing positional sensor-assisted panoramic photography techniques in handheld personal electronic devices. Generalized steps that may be used to carry out the panoramic photography techniques described herein include, but are not necessarily limited to: 1) acquiring image data from the electronic device's image sensor; 2) performing “motion filtering” on the acquired image data, e.g., using information returned from positional sensors of the electronic device to inform the processing of the image data; 3) performing image registration between adjacent captured images; 4) performing geometric corrections on captured image data, e.g., due to perspective changes and/or camera rotation about a non-center of perspective (COP) camera point; and 5) “stitching” the captured images together to create the panoramic scene, e.g., blending the image data in the overlap area between adjacent captured images. The resultant stitched panoramic image may be cropped before final storage. Published 11-22-2012.
20120307000 - Image Registration Using Sliding Registration Windows - This disclosure pertains to devices, methods, and computer readable media for performing image registration. A few generalized steps may be used to carry out the image registration techniques described herein: 1) acquiring image data from an image sensor; 2) selecting a pair of overlapping image portions from the acquired image data for registration; 3) determining an area of “maximum energy” in one of the image portions being registered; 4) placing an image registration window over both image portions at the determined location of maximum energy; 5) registering the overlapping image portions using only the image data falling within the image registration windows; and 6) determining, according to one or more metrics, whether the image registration window should be shifted from a current location before registering subsequently acquired image portions. Published 12-06-2012. (See the sketch after this table.)
20130154935 - Adaptive Acceleration of Mouse Cursor - Disclosed herein are methods and systems for providing a user interface (UI) having a selector controllable by a physical input device. The response of the selector is adaptively adjusted to facilitate executing desired operations within the UI. A response factor defines how far the selector moves for a given movement of the physical input device. The response factor is increased so the selector can be moved a large distance, but is dynamically decreased to provide fine-tuned control of the selector for selecting densely grouped screen elements. Screen elements can be endowed with gravity, making them easy to select, or with anti-gravity, making them more difficult to select. The disclosed methods also provide tactile feedback, such as vibration or braking of the physical input device, to assist a user in executing desired operations. Published 06-20-2013. (See the sketch after this table.)
20130329001 - Motion Adaptive Image Slice Selection - Systems, methods, and computer readable media for adaptively selecting what portion (aka slice) of a first image (aka frame) is selected to overlap and blend with a second frame during frame capture operations are disclosed. In general, for every new frame captured in a sequence, the overlap between it and the slice selected from a prior frame may be determined based, at least in part, on sensor output. If the overlap so determined is below a desired threshold, the position of the current frame's slice may be adjusted so as to provide the desired overlap. Published 12-12-2013.
20130329070 - Projection-Based Image Registration - Systems, methods, and computer readable media are described that register images in real time and are capable of producing reliable registrations even when the number of high-frequency image features is small. The disclosed techniques may also provide a quantitative measure of a registration's quality. The latter may be used to inform the user and/or to automatically determine when visual registration techniques may be less accurate than motion sensor-based approaches. When such a case is detected, an image capture device may be automatically switched from visual-based to sensor-based registration. The disclosed techniques quickly determine indicators of an image's overall composition (row and column projections), which may be used to determine the translation of a first image relative to a second image. The translation so determined may be used to align/register the two images. Published 12-12-2013. (See the sketch after this table.)
20130329071 - Image Blending Operations - Procedures are described for blending images in real time that avoid ghosting artifacts (attributable to moving objects), maintain the proper appearance of contiguous edges in the final image, and permit the use of fast (real-time) blending operations. A “guard-band” may be defined around an initially identified seam that perturbs the path of the initial seam so that both the seam and the guard-band's edges avoid moving objects by at least a specified amount. Rapid blend operations may then be performed in the region demarcated by the guard-band. The seam may be further adjusted to bias its position toward a specified trajectory within the overlap region when there is no moving object present. If visual registration techniques are not able to provide a properly aligned overlap region, motion sensor data for the image capture device may be used instead to facilitate blending operations. Published 12-12-2013.
20130329132 - Flare Detection and Mitigation in Panoramic Images - Lens flare mitigation techniques determine which pixels in images of a sequence of images are likely to be pixels affected by lens flare. Once the lens flare areas of the images are determined, unwanted lens flare effects may be mitigated by various approaches, including reducing border artifacts along a seam between successive images, discarding entire images of the sequence that contain lens flare areas, and using tone-mapping to reduce the visibility of lens flare. Published 12-12-2013.
20140126819 - Region of Interest Based Image Registration - Techniques for registering images based on an identified region of interest (ROI) are described. In general, the disclosed techniques identify an ROI within an image and assign the areas within the image corresponding to that region more importance during the registration process. More particularly, the disclosed techniques may employ user input or image content information to identify the ROI. Once identified, features within the ROI may be given more weight or significance during registration operations than other areas of the image that have high-feature content but are not as important to the individual capturing the image. Published 05-08-2014.
20140195978 - GRANULAR GRAPHICAL USER INTERFACE ELEMENT - A graphical user interface (GUI) element permits a user to control an application in both a coarse manner and a fine manner. When a cursor is moved to coincide with or overlap the displayed GUI element, parameter adjustment is made at a first (coarse) granularity so that rapid changes can be made to the target parameter (e.g., displayed zoom level, image rotation, or playback volume). As the cursor is moved away from the displayed GUI element, parameter adjustment is made at a second (fine) granularity so that fine changes to the target parameter can be made. In one embodiment, the further the cursor is moved from the displayed GUI element, the finer the control. Published 07-10-2014. (See the sketch after this table.)
20140362173 - Exposure Mapping and Dynamic Thresholding for Blending of Multiple Images Using Floating Exposure - Special blend operations for wide area-of-view image generation utilizing a “floating auto exposure” scheme are described. Pixel values in the two images being stitched together are blended within a transition band around a “seam” identified in the overlap region between the images, after changes in exposure and/or color saturation are accounted for. In some embodiments, changes in exposure and/or color saturation are accounted for through the use of one or more exposure mapping curves, the selection and use of which are based, at least in part, on a determined “Exposure Ratio” value, i.e., the amount by which the camera's exposure settings have deviated from their initial capture settings. In other embodiments, the Exposure Ratio value is also used to determine regions along the seam where alpha blending, Poisson blending, or a combination of the two should be used to blend the transitional areas on each side of the seam. Published 12-11-2014.
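Application 20110249756 above describes suggesting per-macroblock compression rates to a video encoder based on whether each macroblock likely contains skin tones, features (e.g., edges), or both. The following is only a rough sketch of that idea, assuming full-resolution YCbCr planes; the skin-tone chroma ranges, gradient threshold, and suggested quantizer values are illustrative assumptions, not figures from the application.

```python
import numpy as np

# Illustrative constants -- assumptions for this sketch, not values from the application.
SKIN_CB = (77, 127)     # plausible Cb range for skin tones in YCbCr
SKIN_CR = (133, 173)    # plausible Cr range for skin tones in YCbCr
EDGE_THRESHOLD = 12.0   # mean gradient magnitude that counts as "contains features"
MB = 16                 # macroblock size in pixels

def suggest_macroblock_qp(y, cb, cr):
    """Return a 2-D array of suggested quantizer values, one per 16x16 macroblock.

    Lower QP means more bits (higher quality). Blocks that look like skin and
    contain edge features get the most bits; blocks with neither get the fewest.
    Assumes y, cb, and cr are full-resolution 2-D arrays of equal shape.
    """
    h, w = y.shape
    rows, cols = h // MB, w // MB
    qp = np.empty((rows, cols), dtype=np.int32)
    for r in range(rows):
        for c in range(cols):
            ys = y[r*MB:(r+1)*MB, c*MB:(c+1)*MB].astype(np.float32)
            cbs = cb[r*MB:(r+1)*MB, c*MB:(c+1)*MB]
            crs = cr[r*MB:(r+1)*MB, c*MB:(c+1)*MB]
            skin = np.mean((cbs >= SKIN_CB[0]) & (cbs <= SKIN_CB[1]) &
                           (crs >= SKIN_CR[0]) & (crs <= SKIN_CR[1])) > 0.5
            gy, gx = np.gradient(ys)
            features = np.mean(np.hypot(gx, gy)) > EDGE_THRESHOLD
            if skin and features:
                qp[r, c] = 18   # spend the most bits on facial features near skin
            elif skin or features:
                qp[r, c] = 26
            else:
                qp[r, c] = 38   # compress unimportant background aggressively
    return qp
```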
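Application 20120105672 above describes gradually biasing the exposure value, e.g., by one-eighth of a stop per captured frame, until the histogram no longer indicates blown-out regions, and then relaxing the bias back to normal. A minimal sketch of that feedback loop follows; the fraction of saturated pixels taken to indicate a blowout is an assumed value.

```python
import numpy as np

EV_STEP = 1.0 / 8.0        # the abstract mentions one-eighth of a stop per captured frame
BLOWOUT_FRACTION = 0.05    # assumed fraction of saturated pixels that indicates a blowout

def update_exposure_bias(frame_luma, current_bias):
    """Return an updated EV bias for the next frame.

    frame_luma: 2-D uint8 array of luminance values for the captured frame.
    current_bias: the EV bias currently applied (0.0 means no compensation).
    """
    hist, _ = np.histogram(frame_luma, bins=256, range=(0, 256))
    saturated = hist[255] / frame_luma.size
    if saturated > BLOWOUT_FRACTION:
        # Histogram indicates blown-out regions: darken gradually.
        return current_bias - EV_STEP
    if current_bias < 0.0:
        # No blowouts anymore: relax the bias back toward normal, one step per frame.
        return min(0.0, current_bias + EV_STEP)
    return current_bias
```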
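Applications 20120293608, 20120293609, 20120293610, and 20120294549 above share a common panorama pipeline whose second step, “motion filtering,” uses positional-sensor data to decide which captured frames are worth processing. Below is a sketch of only that step, assuming the device reports an accumulated pan angle per frame; the 2.5-degree spacing is an invented example, not a value from the applications.

```python
def motion_filter(frames_with_angles, min_angle_between_slices=2.5):
    """Keep only frames whose pan angle has advanced enough since the last kept frame.

    frames_with_angles: iterable of (frame, pan_angle_degrees) pairs, where the angle
    comes from the device's positional sensors (e.g., an integrated gyroscope signal).
    Frames captured while the camera has barely moved add nothing new to the panorama,
    so they are filtered out before registration and stitching.
    """
    kept = []
    last_angle = None
    for frame, angle in frames_with_angles:
        if last_angle is None or abs(angle - last_angle) >= min_angle_between_slices:
            kept.append(frame)
            last_angle = angle
    return kept
```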
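Application 20120307000 above registers overlapping image portions using only the data inside a window placed over the area of “maximum energy.” The abstract does not define energy, so the sketch below approximates it with summed gradient magnitude; the window size and search step are likewise assumptions.

```python
import numpy as np

def place_registration_window(image_strip, window_size=64):
    """Return the (row, col) of the top-left corner of the registration window.

    The window is placed over the region of highest "energy" in the overlapping
    image portion, where energy is approximated here by summed gradient magnitude
    (an assumption for this sketch; the listing only says "maximum energy").
    """
    gy, gx = np.gradient(image_strip.astype(np.float32))
    energy = np.hypot(gx, gy)
    best, best_pos = -1.0, (0, 0)
    h, w = energy.shape
    step = window_size // 2
    for r in range(0, h - window_size + 1, step):
        for c in range(0, w - window_size + 1, step):
            e = energy[r:r+window_size, c:c+window_size].sum()
            if e > best:
                best, best_pos = e, (r, c)
    return best_pos
```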
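Application 20130154935 above adjusts a response factor that maps physical mouse motion to cursor motion, increasing it for fast movements and decreasing it near densely grouped screen elements. A toy sketch of one such mapping follows; the specific curve and constants are assumptions.

```python
def cursor_step(device_delta, device_speed, near_dense_targets):
    """Map a physical mouse delta to a cursor delta with an adaptive response factor.

    device_delta: (dx, dy) counts reported by the physical input device.
    device_speed: magnitude of recent device motion (counts per second).
    near_dense_targets: True when the cursor is over densely grouped screen
    elements and fine-tuned control is wanted.
    """
    # Faster physical motion earns a larger response factor (coarse, fast traversal).
    response = 1.0 + min(device_speed / 400.0, 3.0)   # illustrative curve, capped at 4x
    if near_dense_targets:
        # Dynamically reduce the factor so small, tightly packed targets are easy to hit.
        response *= 0.25
    return device_delta[0] * response, device_delta[1] * response
```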
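Application 20130329070 above determines the translation between two images from their row and column projections. The sketch below aligns the projections with 1-D correlation; the correlation step and normalization are assumptions, since the abstract only says the projections are used to determine the translation.

```python
import numpy as np

def _projection_shift(p1, p2):
    """Return the shift (in pixels) that best aligns 1-D projection p2 onto p1."""
    p1 = (p1 - p1.mean()) / (p1.std() + 1e-9)
    p2 = (p2 - p2.mean()) / (p2.std() + 1e-9)
    corr = np.correlate(p1, p2, mode="full")
    return int(np.argmax(corr)) - (len(p2) - 1)

def register_by_projections(img1, img2):
    """Estimate the (dy, dx) translation of img2 relative to img1.

    Each image is collapsed to a row projection (sum over columns) and a column
    projection (sum over rows); aligning the projections gives the translation.
    The peak correlation values could also serve as a rough quality measure,
    which is one way the described fallback to sensor-based registration might
    be triggered.
    """
    img1 = img1.astype(np.float64)
    img2 = img2.astype(np.float64)
    dy = _projection_shift(img1.sum(axis=1), img2.sum(axis=1))
    dx = _projection_shift(img1.sum(axis=0), img2.sum(axis=0))
    return dy, dx
```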
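Application 20140195978 above makes parameter adjustment coarse while the cursor is on the GUI element and progressively finer as the cursor moves away from it. A minimal sketch of one possible distance-to-granularity mapping follows; the divisor is an invented constant.

```python
def parameter_step(cursor_distance_px, base_step):
    """Scale a parameter adjustment by how far the cursor is from the GUI element.

    On the element itself (distance 0) the adjustment is coarse; the further the
    cursor is dragged away, the finer the adjustment becomes.
    """
    granularity = 1.0 / (1.0 + cursor_distance_px / 100.0)
    return base_step * granularity

# Example: dragging 300 px away from the element reduces each step to a quarter
# of its on-element (coarse) size.
print(parameter_step(0, 10.0))    # 10.0 -> coarse zoom/rotation/volume steps
print(parameter_step(300, 10.0))  # 2.5  -> fine-grained steps
```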


Frank Doepke, Cupertino, CA US

Patent application number - Description - Published
20110293259 - Scene Adaptive Auto Exposure - Systems, methods, and a computer readable medium for an improved automatic exposure algorithm attempt to classify an image into a particular “scene category” and, based on the determined scene category, meter the scene according to a generated metering weighting matrix. In one embodiment, the average luminance is calculated for a central exposure metering region of the image and for a plurality of peripheral exposure metering regions surrounding the central exposure metering region. Based on comparisons of the average luminance values of the peripheral exposure regions to the average luminance of the central exposure region, a target metering weighting matrix may be generated. In another embodiment, the scene category corresponds to a predetermined metering weighting matrix. In video applications, it may be desirable to reduce oscillations in metering parameter values, to limit any visually jarring effects on the camera's preview screen, by only adjusting metering parameter values when predetermined criteria are met. Published 12-01-2011. (See the sketch below.)
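The application above compares the average luminance of a central metering region against surrounding peripheral regions to pick a scene category and a metering weighting matrix. A rough sketch of that comparison follows; the region grid, the 1.5x ratio thresholds, the scene categories, and the weight values are all illustrative assumptions rather than the application's own tables.

```python
import numpy as np

def choose_metering_weights(luma, grid=5):
    """Classify a scene from central vs. peripheral luminance and return weights.

    luma: 2-D luminance array. The frame is divided into grid x grid metering
    regions; the center region's average luminance is compared against the
    average of the surrounding regions, and a weighting matrix is chosen.
    """
    h, w = luma.shape
    regions = np.array([
        [luma[r*h//grid:(r+1)*h//grid, c*w//grid:(c+1)*w//grid].mean()
         for c in range(grid)] for r in range(grid)])
    center = regions[grid//2, grid//2]
    peripheral = (regions.sum() - center) / (grid*grid - 1)

    weights = np.ones((grid, grid))
    if peripheral > 1.5 * center:
        # Bright surround, dark subject (e.g., backlit): weight the center heavily.
        weights[grid//2, grid//2] = 8.0
        category = "backlit"
    elif center > 1.5 * peripheral:
        # Bright subject on a dark surround (e.g., spotlit stage): spread weight outward.
        weights[:, :] = 2.0
        weights[grid//2, grid//2] = 1.0
        category = "spotlit"
    else:
        category = "evenly lit"
    return category, weights / weights.sum()
```

For the video case the abstract also mentions damping oscillations; in practice that could mean only switching to a new weighting matrix after the same scene category has been observed for several consecutive frames.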

Hubertus Doepke, Rosenheim DE

Patent application number - Description - Published
20140247009 - Motor Vehicle - A motor vehicle includes at least one electric motor for driving the motor vehicle, at least one electrical energy storage device via which the electric motor can be supplied with electrical current, and a charging device having at least one coil via which electrical energy can be inductively transmitted for charging the electrical energy storage device. An internal combustion engine of the motor vehicle includes a reservoir in which lubricant for lubricating the internal combustion engine can be received and on which the coil, and optionally at least one electronics component associated with the coil, is arranged. Published 09-04-2014.

Matthias Doepke, Garbsen DE

Patent application number - Description - Published
20090121721 - DEVICE FOR MONITORING CELL VOLTAGE - A device for monitoring a rechargeable battery having a number of electrically connected cells includes at least one current interruption switch for interrupting current flowing through at least one associated cell and a plurality of monitoring units for detecting cell voltage. Each monitoring unit is associated with a single cell and includes a reference voltage unit for producing a defined reference threshold voltage and a voltage comparison unit for comparing the reference threshold voltage with a partial cell voltage of the associated cell. The reference voltage unit is electrically supplied from the cell voltage of the associated cell. The voltage comparison unit is coupled to the at least one current interruption switch for interrupting at least the current flowing through the associated cell when there is a defined minimum difference between the reference threshold voltage and the partial cell voltage. Published 05-14-2009.