  • ROBUST OBJECT TRACKING USING JOINT COLOR-TEXTURE HISTOGRAM This report is written by Jifeng Ning, Lei Zhang, David Zhang and Chengke Wu.

    A novel object tracking algorithm is presented in this paper by using the joint color-texture
    histogram to represent a target and then applying it to the mean shift framework.
    Apart from the conventional color histogram features, the texture features of
    the object are also extracted by using the local binary pattern (LBP) technique to
    represent the object. The major uniform LBP patterns are exploited to form a mask
    for joint color-texture feature selection. Compared with the traditional color histogram
    based algorithms that use the whole target region for tracking, the proposed algorithm
    effectively extracts the edge and corner features in the target region, which characterize
    the target better and represent it more robustly. The experimental results validate that
    the proposed method greatly improves the tracking accuracy and efficiency, with fewer
    mean shift iterations than standard mean shift tracking. It can robustly track the target
    in complex scenes, such as when the target and background have similar appearance,
    where traditional color-based schemes may fail.


    Introduction:

    Real-time object tracking is a critical task in computer vision applications. Many
    tracking algorithms have been proposed to overcome the difficulties arising from
    noise, occlusion, clutter and changes in the foreground object or in the background
    environment. Among the various tracking algorithms, mean shift tracking algorithms
    have recently become popular due to their simplicity and efficiency.

    The mean shift algorithm was originally proposed by Fukunaga and Hostetler
    for data clustering. It was later introduced into the image processing community by
    Cheng. Bradski modified it and developed the Continuously Adaptive Mean Shift
    (CAMSHIFT) algorithm to track a moving face. Comaniciu and Meer successfully
    applied the mean shift algorithm to image segmentation and object tracking. Mean
    shift is an iterative kernel-based deterministic procedure which converges to a local
    maximum of the measurement function under certain assumptions on the kernel
    behavior. Furthermore, mean shift is a low-complexity algorithm, which provides
    a general and reliable solution to object tracking and is independent of the target
    representation.
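
    To make the procedure concrete, here is a minimal sketch (in Python/NumPy, an
    assumption of this post rather than anything prescribed by the paper) of one mean
    shift location update using the histogram back-projection weights of Comaniciu
    et al.; the function name and data layout are hypothetical:

        import numpy as np

        def mean_shift_step(pixels, bins, q, p, eps=1e-10):
            # pixels: (N, 2) coordinates of the pixels in the candidate region
            # bins:   (N,)   histogram bin index of each pixel
            # q, p:   (B,)   target model and current candidate histograms
            # Weight each pixel by sqrt(q/p) of its bin, then move the window
            # center to the weighted mean of the pixel coordinates.
            w = np.sqrt(q[bins] / (p[bins] + eps))
            return (w[:, None] * pixels).sum(axis=0) / w.sum()

    In a full tracker this step is iterated, recomputing the candidate histogram at
    each new center, until the shift falls below a small threshold or a maximum
    number of iterations is reached.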

    The texture patterns, which reflect the spatial structure of the object,
    are effective features to represent and recognize targets. Since the texture features
    introduce new information that the color histogram does not convey, using the joint
    color-texture histogram for target representation is more reliable than using only
    color histogram in tracking complex scenes. The idea of combining color and edge for
    target representation has been exploited by researchers.7,10 However, how to
    effectively utilize both the color intensity and texture features is still a difficult problem.

    This is because, though many texture analysis methods, such as gray co-occurrence
    matrices9 and Gabor filtering, have been proposed, they have high computational
    complexity and cannot be directly used together with the color histogram.
    Currently, a widely used form of target representation is the color histogram,
    which can be viewed as the discrete probability density function (PDF) of the
    target region. The color histogram is an estimate of the point sample distribution
    and is very robust in representing the object appearance. However, using only color
    histograms in mean shift tracking has some problems. First, the spatial information
    of the target is lost. Second, when the target has a similar appearance to the
    background, the color histogram becomes unable to distinguish them. For a better
    target representation, gradient or edge features have been used in combination
    with the color histogram. Several object representations that exploit the spatial
    information have been developed by partitioning the tracking region into fixed-size
    fragments, meaningful patches or the articulations of human objects. For each
    subregion, a color or edge feature based target model is then built.
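
    As an illustration of such a representation, the following sketch (a minimal
    NumPy version, assuming a quantized RGB space and an Epanechnikov-style kernel
    that weights pixels near the region center more heavily, as in kernel-based
    tracking; both are assumptions for illustration) computes a normalized color
    histogram of a target region:

        import numpy as np

        def color_histogram(region, n_bins=16):
            # region: (H, W, 3) uint8 RGB patch covering the target
            h, w, _ = region.shape
            # Kernel weights: pixels near the center contribute more.
            ys, xs = np.mgrid[0:h, 0:w]
            r2 = ((ys - h / 2) / (h / 2)) ** 2 + ((xs - w / 2) / (w / 2)) ** 2
            k = np.maximum(1.0 - r2, 0.0)
            # Quantize each channel into n_bins levels; form a joint bin index.
            q = (region.astype(np.int64) * n_bins) // 256
            idx = (q[..., 0] * n_bins + q[..., 1]) * n_bins + q[..., 2]
            hist = np.bincount(idx.ravel(), weights=k.ravel(),
                               minlength=n_bins ** 3)
            return hist / hist.sum()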

    The local binary pattern (LBP)16,17 technique is very effective for describing
    image texture features. LBP has advantages such as fast computation and rotation
    invariance, which have facilitated its wide usage in the fields of texture analysis,
    image retrieval, face recognition, image segmentation, etc. Recently, LBP was
    successfully applied to the detection of moving objects via background
    subtraction. In LBP, each pixel is assigned a texture value, which can be naturally
    combined with the color value of the pixel to represent targets. In Ref. , Nguyen
    et al. employed the image intensity and the LBP feature to construct a two-dimensional
    histogram representation of the target for tracking thermographic and
    monochromatic video.
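
    For reference, the basic 8-neighbor LBP operator thresholds each pixel's
    neighbors against the pixel itself and packs the results into an 8-bit code. The
    sketch below is a plain NumPy version of that basic operator (library
    implementations such as scikit-image's local_binary_pattern add circular
    interpolation and uniform-pattern mapping):

        import numpy as np

        def lbp_basic(img):
            # img: (H, W) grayscale array; returns (H-2, W-2) codes in [0, 255].
            c = img[1:-1, 1:-1]  # center pixels (borders excluded)
            # The 8 neighbors, clockwise from the top-left corner.
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                       (1, 1), (1, 0), (1, -1), (0, -1)]
            code = np.zeros(c.shape, dtype=np.int32)
            for bit, (dy, dx) in enumerate(offsets):
                nb = img[1 + dy: img.shape[0] - 1 + dy,
                         1 + dx: img.shape[1] - 1 + dx]
                code += (nb >= c).astype(np.int32) << bit
            return code.astype(np.uint8)

    A code is called "uniform" when its circular 8-bit string contains at most two
    0/1 transitions; such patterns correspond to spots, flat areas, edges and
    corners, which is what the masking scheme below builds on.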

    In this paper, we adopt the LBP scheme to represent the target texture feature
    and then propose a joint color-texture histogram method for a more distinctive and
    effective target representation. The major uniform LBP patterns are used to identify
    the key points in the target region and then form a mask for joint color-texture
    feature selection. The proposed target representation scheme eliminates the smooth
    background and reduces noise in the tracking process. Compared with the traditional
    RGB color space based target representation, it efficiently exploits the target's
    structural information and hence achieves better tracking performance, with fewer
    mean shift iterations and higher robustness to the various interferences of background
    and noise in complex scenes.
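
    A minimal sketch of the general idea (not the authors' exact algorithm; the
    particular set of "major" uniform patterns and the histogram layout here are
    assumptions for illustration): keep only the pixels whose LBP code falls in the
    chosen pattern set, and histogram (color bin, pattern) pairs for those pixels.

        import numpy as np

        def joint_color_texture_hist(rgb, gray, major_codes, n_bins=16):
            # rgb:  (H, W, 3) color patch;  gray: (H, W) grayscale of the same patch
            # major_codes: sorted 1-D array of LBP codes kept as "major" patterns
            lbp = lbp_basic(gray)                 # from the sketch above
            rgb = rgb[1:-1, 1:-1]                 # align with the LBP output
            mask = np.isin(lbp, major_codes)      # keep edge/corner pixels only
            q = (rgb[mask].astype(np.int64) * n_bins) // 256
            color_bin = (q[:, 0] * n_bins + q[:, 1]) * n_bins + q[:, 2]
            pattern_id = np.searchsorted(major_codes, lbp[mask])
            joint = color_bin * len(major_codes) + pattern_id
            hist = np.bincount(joint, minlength=n_bins ** 3 * len(major_codes))
            return hist / max(hist.sum(), 1)

    Tracking then proceeds in the mean shift framework of Sec. 2, with this joint
    histogram in place of the plain color histogram.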

    The paper is organized as follows. Section 2 briefly introduces the mean shift
    algorithm. Section 3 analyzes LBP and presents the joint color-texture histogram
    scheme in detail. Experimental results are presented and discussed in Sec. 4.
    Section 5 concludes the paper.

  • Matlab development implementation Hello there, it has been a long time since I last updated the content of this blog. Recently I have found it quite a headache to continue this project alone, but I will never give up until I really finish it. The due date to submit the report might be two weeks from now. I have started writing my report.

    After I finish my report I will tell you what outcomes I get from it.
    See you later.




  • Learning Matlab Process Today I just want to inform you that I'm going to learn MATLAB to develop my software.

    The basic MATLAB topics I need to know are:

    1) Basic features of MATLAB
    2) MATLAB desktop management
    3) Script M-files
    4) Arrays and array operations
    5) Multidimensional arrays
    6) Cell arrays and structures
    7) Relational and logical operations
    8) Control flow
    9) Functions
    10) Matrix algebra
    11) Fourier analysis


    The thesis-related work using MATLAB will be covered after I finish the chapters above.




  • FIR filter A finite impulse response (FIR) filter is a type of digital filter. The impulse response, the filter's response to a Kronecker delta input, is finite because it settles to zero in a finite number of sample intervals. This is in contrast to infinite impulse response (IIR) filters, which have internal feedback and may continue to respond indefinitely. The impulse response of an Nth-order FIR filter lasts for N+1 samples, and then dies to zero.
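
    A quick numerical illustration (a Python/NumPy sketch; the 5-tap moving average
    is just an arbitrary example filter): feeding a Kronecker delta into a 4th-order
    FIR filter yields an impulse response that is nonzero for exactly N+1 = 5
    samples and zero afterwards.

        import numpy as np

        taps = np.ones(5) / 5.0   # 4th-order FIR: N+1 = 5 taps (moving average)

        delta = np.zeros(16)      # Kronecker delta input
        delta[0] = 1.0

        # y[n] = sum_k taps[k] * x[n-k]: direct convolution of input and taps.
        y = np.convolve(delta, taps)[:16]
        print(y)                  # five samples of 0.2, then exact zeros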



  • Journal List reference
    ---------------------------------

    From COMPLEX DISCRETE WAVELET TRANSFORM BASED MOTION ESTIMATION.pdf

    1) An iterative image registration technique with an application to stereo vision [B. Lucas and T. Kanade].pdf

    2) Determining Optical Flow [B. K. P. Horn and B. G. Schunck].pdf

    3) Computation of component image velocity from local phase information [D. J. Fleet and A. D. Jepson].pdf

    4) Performance of Optical Flow [J. L. Barron, D. J. Fleet and S. S. Beauchemin].pdf

    5) Complex wavelets and shift invariance by N. G. Kingsbury

    6) Mesh-Based Motion Estimation and Compensation in the wavelet domain using a redundant transform [S. Cui, Y. Wang and J. E. Fowler].pdf

    7) An Overcomplete Discrete Wavelet Transform for Video Compression by N. Sebe.pdf

    8) A New Framework for Complex Wavelet Transform by F. C. A. Fernandes.pdf

    9) Motion Estimation Using a Complex-Valued Wavelet Transform by J. Magarey.pdf

    10) Real-Time Tracking of Non-Rigid Objects Using Mean Shift [by D. Comaniciu].pdf



  • What is MPEG-1? MPEG-1 is a standard for lossy compression of video and audio. It is designed to compress VHS-quality raw digital video and CD audio down to 1.5 Mbit/s (26:1 and 6:1 compression ratios respectively)[1] without excessive quality loss, making Video CDs, digital cable/satellite TV and digital audio broadcasting (DAB) possible.[2][3]
    Today, MPEG-1 has become the most widely compatible lossy audio/video format in the world, and is used in a large number of products and technologies. Perhaps the best-known part of the MPEG-1 standard is the MP3 audio format it introduced.
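
    Back-of-the-envelope arithmetic behind those ratios (a Python sketch assuming
    SIF-resolution 4:2:0 video and standard CD audio parameters, which are
    reasonable readings of "VHS-quality" and "CD audio" here):

        # Raw SIF video: 352 x 240 pixels, 29.97 frames/s, 12 bits/pixel (4:2:0).
        raw_video = 352 * 240 * 29.97 * 12      # about 30.4 Mbit/s
        print(raw_video / 26 / 1e6)             # ~1.17 Mbit/s after 26:1

        # Raw CD audio: 44.1 kHz, 16 bits/sample, 2 channels.
        raw_audio = 44100 * 16 * 2              # about 1.41 Mbit/s
        print(raw_audio / 6 / 1e3)              # ~235 kbit/s after 6:1

    The two compressed streams together come to roughly 1.4 Mbit/s, consistent with
    the 1.5 Mbit/s target.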
    The MPEG-1 standard is published as ISO/IEC 11172. The standard consists of the following five Parts:
    1) Systems (storage and synchronization of video, audio, and other data together)
    2) Video (compressed video content)
    3) Audio (compressed audio content)
    4) Conformance testing (testing the correctness of implementations of the standard)
    5) Reference software (example software showing how to encode and decode according to the standard)


    History

    Modeled on the successful collaborative approach and the compression technologies developed by the Joint Photographic Experts Group and CCITT's Experts Group on Telephony (creators of the JPEG image compression standard and the H.261 standard for video conferencing respectively), the Moving Picture Experts Group (MPEG) working group was established in January 1988. MPEG was formed to address the need for standard video and audio formats, and to build on H.261 to get better quality through the use of more complex encoding methods.[2][4]
    Development of the MPEG-1 standard began in May 1988. 14 video and 14 audio codec proposals were submitted by individual companies and institutions for evaluation. The codecs were extensively tested for computational complexity and subjective (human perceived) quality, at data rates of 1.5 Mbit/s. This specific bitrate was chosen for transmission over T-1/E-1 lines and as the approximate data rate of audio CDs.[5] The codecs that excelled in this testing were utilized as the basis for the standard and refined further, with additional features and other improvements being incorporated in the process.[6]
    After 20 meetings of the full group in various cities around the world, and 4½ years of development and testing, the final standard (for Parts 1-3) was approved in early November 1992 and published a few months later.[7] The reported completion date of the MPEG-1 standard varies greatly: a largely complete draft standard was produced in September 1990, and from that point on, only minor changes were introduced.[2] The draft standard was publicly available for purchase.[8] The standard was finished with the 6 November 1992 meeting.[9] The Berkeley Plateau Multimedia Research Group developed an MPEG-1 decoder in November 1992.[10]

    In July 1990, before the first draft of the MPEG-1 standard had even been written, work began on a second standard, MPEG-2,[11] intended to extend MPEG-1 technology to provide full broadcast-quality video (as per CCIR 601) at high bitrates (3-15 Mbit/s) and support for interlaced video.[12] Due in part to the similarity between the two codecs, the MPEG-2 standard includes full backwards compatibility with MPEG-1 video, so any MPEG-2 decoder can play MPEG-1 videos.[13]
    Notably, the MPEG-1 standard very strictly defines the bitstream and decoder function, but does not define how MPEG-1 encoding is to be performed (although a reference implementation is provided in ISO/IEC 11172-5).[1] This means that MPEG-1 coding efficiency can vary drastically depending on the encoder used, and generally means that newer encoders perform significantly better than their predecessors.[14] The first three parts (Systems, Video and Audio) of ISO/IEC 11172 were published in August 1993.

    Patents

    MPEG-1 video and Layer I/II audio may be implementable without payment of license fees.[16][17][18][19][20] The ISO patent database lists one patent for ISO 11172, US 4,472,747, which expired in 2003.[21] The near-complete draft of the MPEG-1 standard was publicly available as ISO CD 11172[8] by December 6, 1991.[22] Due to its age, many of the patents on the technology have expired. Neither the Kuro5hin article "Patent Status of MPEG-1, H.261 and MPEG-2"[23] nor a thread on the gstreamer-devel[24] mailing list was able to identify a single unexpired MPEG-1 video or Layer I/II audio patent. However, a full MPEG-1 decoder and encoder cannot be implemented royalty-free, since some companies require patent fees for implementations of MPEG-1 Layer 3 audio.

    Applications

    • Most popular computer software for video playback includes MPEG-1 decoding, in addition to any other supported formats.
    • The popularity of MP3 audio has established a massive installed base of hardware that can play back MPEG-1 Audio (all three layers).
    • "Virtually all digital audio devices" can play back MPEG-1 Audio.[25] Many millions have been sold to date.
    • Before MPEG-2 became widespread, many digital satellite/cable TV services used MPEG-1 exclusively.[4][14]
    • The widespread popularity of MPEG-2 with broadcasters means MPEG-1 is playable by most digital cable and satellite set-top boxes, and digital disc and tape players, due to backwards compatibility.
    • MPEG-1 is the exclusive video and audio format used on Video CD (VCD), the first consumer digital video format, and still a very popular format around the world.
    • The Super Video CD standard, based on VCD, uses MPEG-1 Audio exclusively, as well as MPEG-2 video.
    • The DVD-Video format uses MPEG-2 video primarily, but MPEG-1 support is explicitly defined in the standard.
    • The DVD Video standard originally required MPEG-1 Layer II audio for PAL countries, but was changed to allow AC-3/Dolby Digital-only discs. MPEG-1 Layer II audio is still allowed on DVDs, although newer extensions to the format, like MPEG Multichannel, are rarely supported.
    • Most DVD players also support Video CD and MP3 CD playback, which use MPEG-1.
    • The international Digital Video Broadcasting (DVB) standard primarily uses MPEG-1 Layer II audio, and MPEG-2 video.
    • The international Digital Audio Broadcasting (DAB) standard uses MPEG-1 Layer II audio exclusively, due to MP2's especially high quality, modest decoder performance requirements, and tolerance of errors.





  • what is .MP4 file? MPEG-4 Part 14, formally ISO/IEC 14496-14:2003, is a multimedia container format standard specified as a part of MPEG-4. It is most commonly used to store digital video and digital audio streams, especially those defined by MPEG, but can also be used to store other data such as subtitles and still images. Like most modern container formats, MPEG-4 Part 14 allows streaming over the Internet. A separate hint track is used to include streaming information in the file. The official filename extension for MPEG-4 Part 14 files is .mp4, thus the container format is often referred to simply as MP4.
    Some devices advertised as "MP4 players" are simply MP3 players that also play AMV video and/or some other video format, and do not play the MPEG-4 Part 14 format.
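
    Structurally, the container is a sequence of "boxes", each headed by a 4-byte
    big-endian size and a 4-character type. The sketch below (plain Python; the file
    name is hypothetical, and the rare 64-bit "largesize" and size-0 "to end of
    file" cases are simply skipped) lists the top-level boxes of an .mp4 file,
    typically ftyp, moov and mdat:

        import struct

        def list_top_level_boxes(path):
            with open(path, "rb") as f:
                while True:
                    header = f.read(8)
                    if len(header) < 8:
                        break
                    size, box_type = struct.unpack(">I4s", header)
                    print(box_type.decode("ascii", "replace"), size)
                    if size < 8:           # size 0 or 1 (special cases): stop
                        break
                    f.seek(size - 8, 1)    # skip the rest of this box

        # list_top_level_boxes("movie.mp4")  # hypothetical file name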

    History of MP4

    MPEG-4 Part 14 is based upon ISO/IEC 14496-12:2004 (MPEG-4 Part 12: ISO base media file format) which is directly based upon Apple’s QuickTime container format.[2][3][4] MPEG-4 Part 14 is essentially identical to the MOV format, but formally specifies support for Initial Object Descriptors (IOD) and other MPEG features.[5] MPEG-4 Part 14 revises and completely replaces Clause 13 of ISO/IEC 14496-1 (MPEG-4 Part 1: Systems), in which the file format for MPEG-4 content was previously specified.[6]
    The MPEG-4 file format specification was created on the basis of the QuickTime format specification published in 2001.[7] The MPEG-4 file format, version 1, was published in 2001 as ISO/IEC 14496-1:2001, which is a revision of the MPEG-4 Part 1: Systems specification published in 1999 (ISO/IEC 14496-1:1999).[8][9][10] In 2003, the first version of the MP4 file format was revised and replaced by MPEG-4 Part 14: MP4 file format (ISO/IEC 14496-14:2003), commonly known as MPEG-4 file format version 2.[11] The MP4 file format was then generalized into the ISO Base Media File Format, ISO/IEC 14496-12:2004, which defines a general structure for time-based media files. It in turn is used as the basis for other file formats in the family (for example MP4, 3GP, Motion JPEG 2000).[2][12][13]
    The MP4 file format defined some extensions over the ISO Base Media File Format to support MPEG-4 visual/audio codecs and various MPEG-4 Systems features such as object descriptors and scene descriptions. Some of these extensions are also used by other formats based on the ISO base media file format (e.g. 3GP).[1] A list of all registered extensions for the ISO Base Media File Format is published on the official registration authority website www.mp4ra.org. The registration authority for code-points (identifier values) in "MP4 Family" files is Apple Computer Inc., which is named in Annex D (informative) of MPEG-4 Part 12.[12] Codec designers should register the codes they invent, but the registration is not mandatory,[14] and some invented and used code-points are not registered.[15] When someone is creating a new specification derived from the ISO Base Media File Format, all the existing specifications should be used both as examples and as a source of definitions and technology. If an existing specification already covers how a particular media type is stored in the file format (e.g. MPEG-4 audio or video in MP4), that definition should be used and a new one should not be invented.

    .MP4 versus .M4A file extensions

    The existence of two different file extensions for naming audio-only MP4 files has been a source of confusion among users and multimedia playback software. Since MPEG-4 Part 14 is a container format, MPEG-4 files may contain any number of audio, video, and even subtitle streams, making it impossible to determine the type of streams in an MPEG-4 file based on its filename extension alone. In response, Apple Inc. started using and popularizing the .m4a file extension. Software capable of audio/video playback should recognize files with either .m4a or .mp4 file extensions, as would be expected, as there are no file format differences between the two. Most software capable of creating MPEG-4 audio will allow the user to choose the filename extension of the created MPEG-4 files.
    While the only official file extension defined by the standard is .mp4, various file extensions are commonly used to indicate intended content:
    MPEG-4 files with audio and video generally use the standard .mp4 extension.
    Audio-only MPEG-4 files generally have a .m4a extension. This is especially true of non-protected content.
    MPEG-4 files with audio streams encrypted by FairPlay Digital Rights Management as sold through the iTunes Store use the .m4p extension. iTunes Plus tracks are unencrypted and use .m4a accordingly.
    Audio book and podcast files, which also contain metadata including chapter markers, images, and hyperlinks, can use the extension .m4a, but more commonly use the .m4b extension. An .m4a audio file cannot "bookmark" (remember the last listening spot), whereas .m4b extension files can.
    The Apple iPhone uses MPEG-4 audio for its ringtones but uses the .m4r extension rather than the .m4a extension.
    Raw MPEG-4 Visual bitstreams are named .m4v but this extension is also sometimes used for video in MP4 container format.[16]
    Mobile phones use 3GP, an implementation of MPEG-4 Part 12 (a.k.a. the MPEG-4/JPEG 2000 ISO Base Media File Format), similar to MP4. It uses the .3gp and .3g2 extensions. These files can also store non-MPEG-4 data (H.263, AMR, TX3G).
    The common but non-standard use of the extensions .m4a and .m4v is due to the popularity of Apple’s iPod, iPhone, and iTunes Store. With modification, Nintendo's DSi and Sony's PSP can also play M4A.



