Bitmap Notes

These are notes from reading up on Bitmap-related material.
Program runtime environment:

Linux KNIGHT 3.0.0-28-generic #45-Ubuntu SMP Wed Nov 14 21:57:26 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

A BMP image file is divided into four parts: the bitmap file header (Bitmap File Header), the bitmap information header (Bitmap Info Header), the color table (Color Map), and the bitmap data (i.e. the image data, Data Bits or Data Body).

Chapter 1 of this book (http://vipbase.net/ipbook/chap01.htm) covers these.

biBitCount: the number of bits per pixel. Its value must be 1 (monochrome image), 4 (16-color image), 8 (256 colors), 24 (true color), or 32 (with an alpha transparency channel in the top 8 bits).

The following blog posts cover some of the basics:
http://www.cnblogs.com/shengansong/archive/2011/09/23/2186409.html
http://blog.csdn.net/yutianzuijin/article/details/8243343
http://blog.csdn.net/scut1135/article/details/5573395

Based on the material above I put together a program that compiles and runs on x86_64 GNU/Linux; the comments note a few things to be careful about.

Some pitfalls of __attribute__((packed)): please refer to the links below, and see the sketch after them.
http://stackoverflow.com/questions/8568432/is-gccs-attribute-packed-pragma-pack-unsafe
http://stackoverflow.com/questions/11770451/what-is-the-meaning-of-attribute-packed-aligned4
http://stackoverflow.com/questions/11667181/why-does-padding-have-to-be-a-power-of-two
http://stackoverflow.com/questions/119123/why-isnt-sizeof-for-a-struct-equal-to-the-sum-of-sizeof-of-each-member
http://en.wikipedia.org/wiki/Data_structure_alignment
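
To make the packing issue concrete, here is a minimal sketch (using the conventional BITMAPFILEHEADER/BITMAPINFOHEADER field names; treat it as an assumption to check against the references above) of the two headers declared packed so that sizeof() matches the on-disk layout:

#include <cstdint>

// BMP file header: 14 bytes on disk. Without __attribute__((packed)) the
// compiler would insert 2 bytes of padding after bfType so that bfSize is
// 4-byte aligned, and sizeof() would no longer match the file layout.
struct __attribute__((packed)) BitmapFileHeader {
    uint16_t bfType;        // 'BM' (0x4D42)
    uint32_t bfSize;        // total file size in bytes
    uint16_t bfReserved1;
    uint16_t bfReserved2;
    uint32_t bfOffBits;     // offset from the start of the file to the pixel data
};

// BMP info header (BITMAPINFOHEADER): 40 bytes on disk, naturally aligned,
// so packing changes nothing here but documents the intent.
struct __attribute__((packed)) BitmapInfoHeader {
    uint32_t biSize;            // size of this header (40)
    int32_t  biWidth;
    int32_t  biHeight;          // positive: bottom-up rows; negative: top-down
    uint16_t biPlanes;          // must be 1
    uint16_t biBitCount;        // 1, 4, 8, 24 or 32 bits per pixel
    uint32_t biCompression;
    uint32_t biSizeImage;
    int32_t  biXPelsPerMeter;
    int32_t  biYPelsPerMeter;
    uint32_t biClrUsed;
    uint32_t biClrImportant;
};

// With packing: sizeof(BitmapFileHeader) == 14 and sizeof(BitmapInfoHeader) == 40.

Reading with fread(&fileHeader, sizeof(fileHeader), 1, fp) then picks up each header exactly as it appears in the file (on a little-endian machine).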

Source code for a BMP-to-JPEG converter on Linux
http://www.linuxidc.com/Linux/2011-03/33193.htm

If you are going to do JPEG-related development yourself, make sure these packages are installed:

sudo apt-get install libjpeg62
sudo apt-get install libjpeg62-dev

Android MediaRecorder System Structure

Earlier I analyzed the Camera implementation; now let's look at how MediaRecorder is implemented. I will not pay much attention to its layering here; I care more about its logic.

APP layer: /path/to/aosp/frameworks/base/media/java/android/media/MediaRecorder.java
JNI layer: /path/to/aosp/frameworks/base/media/jni/android_media_MediaRecorder.cpp
which calls the NATIVE-layer MediaRecorder (a BnMediaRecorderClient here):
header: /path/to/aosp/frameworks/av/include/media/mediarecorder.h
implementation: /path/to/aosp/frameworks/av/media/libmedia/mediarecorder.cpp

MediaRecorder::MediaRecorder() : mSurfaceMediaSource(NULL)
{
    ALOGV("constructor");

    const sp<IMediaPlayerService>& service(getMediaPlayerService());
    if (service != NULL) {
        mMediaRecorder = service->createMediaRecorder(getpid());
    }
    if (mMediaRecorder != NULL) {
        mCurrentState = MEDIA_RECORDER_IDLE;
    }

    doCleanUp();
}

The getMediaPlayerService() method lives in /path/to/aosp/frameworks/av/include/media/IMediaDeathNotifier.h

After obtaining the MediaPlayerService (a BpMediaPlayerService), it calls the following method of IMediaPlayerService:

sp<IMediaRecorder> MediaPlayerService::createMediaRecorder(pid_t pid)
{
    sp<MediaRecorderClient> recorder = new MediaRecorderClient(this, pid);
    wp<MediaRecorderClient> w = recorder;
    Mutex::Autolock lock(mLock);
    mMediaRecorderClients.add(w);
    ALOGV("Create new media recorder client from pid %d", pid);
    return recorder;
}

This creates the MediaRecorderClient (a BnMediaRecorder here).

What the caller gets back through binder, however, is a BpMediaRecorder, because of the following interface_cast step:

virtual sp<IMediaRecorder> createMediaRecorder(pid_t pid)
{
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeInt32(pid);
    remote()->transact(CREATE_MEDIA_RECORDER, data, &reply);
    return interface_cast<IMediaRecorder>(reply.readStrongBinder());
}
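
For reference, interface_cast itself is just a small template in the binder headers (roughly, from IInterface.h):

template<typename INTERFACE>
inline sp<INTERFACE> interface_cast(const sp<IBinder>& obj)
{
    // asInterface() is generated by IMPLEMENT_META_INTERFACE: if obj is a
    // local binder it returns the Bn object itself, otherwise it wraps the
    // remote handle in a new Bp proxy (BpMediaRecorder in this case).
    return INTERFACE::asInterface(obj);
}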

MediaRecorderClient in turn creates a StagefrightRecorder (a MediaRecorderBase), which lives at
/path/to/aosp/frameworks/av/media/libmediaplayerservice/StagefrightRecorder.cpp

For now we can think of the APP/JNI/NATIVE side as running in one process, while the MediaRecorderClient/StagefrightRecorder inside MediaPlayerService run in another; the two communicate over binder. We have seen both the Bp and the Bn ends, so from here on we will not carefully distinguish Bp from Bn.

On the client side:
BnMediaRecorderClient
BpMediaRecorder
BpMediaPlayerService

On the service side:
BpMediaRecorderClient (the service can obtain this Bp if it needs to notify the client)
BnMediaRecorder
BnMediaPlayerService

There is a diagram here (click through to see the full-size original):
Android MediaRecorder Diagram

Let's take starting a recording, i.e. start(), as an example.

At this point the flow splits in two: a CameraSource on one side and an MPEG4Writer (sp<MediaWriter> mWriter) on the other.
Both classes live in /path/to/aosp/frameworks/av/media/libstagefright/.

status_t StagefrightRecorder::startMPEG4Recording() {
    int32_t totalBitRate;
    status_t err = setupMPEG4Recording(
            mOutputFd, mVideoWidth, mVideoHeight,
            mVideoBitRate, &totalBitRate, &mWriter);
    if (err != OK) {
        return err;
    }

    int64_t startTimeUs = systemTime() / 1000;
    sp<MetaData> meta = new MetaData;
    setupMPEG4MetaData(startTimeUs, totalBitRate, &meta);

    err = mWriter->start(meta.get());
    if (err != OK) {
        return err;
    }

    return OK;
}
status_t StagefrightRecorder::setupMPEG4Recording(
        int outputFd,
        int32_t videoWidth, int32_t videoHeight,
        int32_t videoBitRate,
        int32_t *totalBitRate,
        sp<MediaWriter> *mediaWriter) {
    mediaWriter->clear();
    *totalBitRate = 0;
    status_t err = OK;
    sp<MediaWriter> writer = new MPEG4Writer(outputFd);

    if (mVideoSource < VIDEO_SOURCE_LIST_END) {

        sp<MediaSource> mediaSource;
        err = setupMediaSource(&mediaSource); // very important
        if (err != OK) {
            return err;
        }

        sp<MediaSource> encoder;
        err = setupVideoEncoder(mediaSource, videoBitRate, &encoder); // very important
        if (err != OK) {
            return err;
        }

        writer->addSource(encoder);
        *totalBitRate += videoBitRate;
    }

    // Audio source is added at the end if it exists.
    // This help make sure that the "recoding" sound is suppressed for
    // camcorder applications in the recorded files.
    if (!mCaptureTimeLapse && (mAudioSource != AUDIO_SOURCE_CNT)) {
        err = setupAudioEncoder(writer); // very important
        if (err != OK) return err;
        *totalBitRate += mAudioBitRate;
    }

    ...

    writer->setListener(mListener);
    *mediaWriter = writer;
    return OK;
}
// Set up the appropriate MediaSource depending on the chosen option
status_t StagefrightRecorder::setupMediaSource(
                      sp<MediaSource> *mediaSource) {
    if (mVideoSource == VIDEO_SOURCE_DEFAULT
            || mVideoSource == VIDEO_SOURCE_CAMERA) {
        sp<CameraSource> cameraSource;
        status_t err = setupCameraSource(&cameraSource);
        if (err != OK) {
            return err;
        }
        *mediaSource = cameraSource;
    } else if (mVideoSource == VIDEO_SOURCE_GRALLOC_BUFFER) {
        // If using GRAlloc buffers, setup surfacemediasource.
        // Later a handle to that will be passed
        // to the client side when queried
        status_t err = setupSurfaceMediaSource();
        if (err != OK) {
            return err;
        }
        *mediaSource = mSurfaceMediaSource;
    } else {
        return INVALID_OPERATION;
    }
    return OK;
}
status_t StagefrightRecorder::setupCameraSource(
        sp<CameraSource> *cameraSource) {
    status_t err = OK;
    if ((err = checkVideoEncoderCapabilities()) != OK) {
        return err;
    }
    Size videoSize;
    videoSize.width = mVideoWidth;
    videoSize.height = mVideoHeight;
    if (mCaptureTimeLapse) {
        if (mTimeBetweenTimeLapseFrameCaptureUs < 0) {
            ALOGE("Invalid mTimeBetweenTimeLapseFrameCaptureUs value: %lld",
                mTimeBetweenTimeLapseFrameCaptureUs);
            return BAD_VALUE;
        }

        mCameraSourceTimeLapse = CameraSourceTimeLapse::CreateFromCamera(
                mCamera, mCameraProxy, mCameraId,
                videoSize, mFrameRate, mPreviewSurface,
                mTimeBetweenTimeLapseFrameCaptureUs);
        *cameraSource = mCameraSourceTimeLapse;
    } else {
        *cameraSource = CameraSource::CreateFromCamera(
                mCamera, mCameraProxy, mCameraId, videoSize, mFrameRate,
                mPreviewSurface, true /*storeMetaDataInVideoBuffers*/);
    }
    mCamera.clear();
    mCameraProxy.clear();
    if (*cameraSource == NULL) {
        return UNKNOWN_ERROR;
    }

    if ((*cameraSource)->initCheck() != OK) {
        (*cameraSource).clear();
        *cameraSource = NULL;
        return NO_INIT;
    }

    // When frame rate is not set, the actual frame rate will be set to
    // the current frame rate being used.
    if (mFrameRate == -1) {
        int32_t frameRate = 0;
        CHECK ((*cameraSource)->getFormat()->findInt32(
                    kKeyFrameRate, &frameRate));
        ALOGI("Frame rate is not explicitly set. Use the current frame "
             "rate (%d fps)", frameRate);
        mFrameRate = frameRate;
    }

    CHECK(mFrameRate != -1);

    mIsMetaDataStoredInVideoBuffers =
        (*cameraSource)->isMetaDataStoredInVideoBuffers();

    return OK;
}
status_t StagefrightRecorder::setupVideoEncoder(
        sp<MediaSource> cameraSource,
        int32_t videoBitRate,
        sp<MediaSource> *source) {
    source->clear();

    sp<MetaData> enc_meta = new MetaData;
    enc_meta->setInt32(kKeyBitRate, videoBitRate);
    enc_meta->setInt32(kKeyFrameRate, mFrameRate);

    switch (mVideoEncoder) {
        case VIDEO_ENCODER_H263:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_H263);
            break;

        case VIDEO_ENCODER_MPEG_4_SP:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_MPEG4);
            break;

        case VIDEO_ENCODER_H264:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
            break;

        default:
            CHECK(!"Should not be here, unsupported video encoding.");
            break;
    }

    sp<MetaData> meta = cameraSource->getFormat();

    int32_t width, height, stride, sliceHeight, colorFormat;
    CHECK(meta->findInt32(kKeyWidth, &width));
    CHECK(meta->findInt32(kKeyHeight, &height));
    CHECK(meta->findInt32(kKeyStride, &stride));
    CHECK(meta->findInt32(kKeySliceHeight, &sliceHeight));
    CHECK(meta->findInt32(kKeyColorFormat, &colorFormat));

    enc_meta->setInt32(kKeyWidth, width);
    enc_meta->setInt32(kKeyHeight, height);
    enc_meta->setInt32(kKeyIFramesInterval, mIFramesIntervalSec);
    enc_meta->setInt32(kKeyStride, stride);
    enc_meta->setInt32(kKeySliceHeight, sliceHeight);
    enc_meta->setInt32(kKeyColorFormat, colorFormat);
    if (mVideoTimeScale > 0) {
        enc_meta->setInt32(kKeyTimeScale, mVideoTimeScale);
    }
    if (mVideoEncoderProfile != -1) {
        enc_meta->setInt32(kKeyVideoProfile, mVideoEncoderProfile);
    }
    if (mVideoEncoderLevel != -1) {
        enc_meta->setInt32(kKeyVideoLevel, mVideoEncoderLevel);
    }

    OMXClient client;
    CHECK_EQ(client.connect(), (status_t)OK);

    uint32_t encoder_flags = 0;
    if (mIsMetaDataStoredInVideoBuffers) {
        encoder_flags |= OMXCodec::kStoreMetaDataInVideoBuffers;
    }

    // Do not wait for all the input buffers to become available.
    // This give timelapse video recording faster response in
    // receiving output from video encoder component.
    if (mCaptureTimeLapse) {
        encoder_flags |= OMXCodec::kOnlySubmitOneInputBufferAtOneTime;
    }

    sp<MediaSource> encoder = OMXCodec::Create(
            client.interface(), enc_meta,
            true /* createEncoder */, cameraSource,
            NULL, encoder_flags);
    if (encoder == NULL) {
        ALOGW("Failed to create the encoder");
        // When the encoder fails to be created, we need
        // release the camera source due to the camera's lock
        // and unlock mechanism.
        cameraSource->stop();
        return UNKNOWN_ERROR;
    }

    *source = encoder;

    return OK;
}

This is where things hook up with OMXCodec; a configuration file called media_codecs.xml declares which codecs the device supports.

When we record MPEG-4 there is audio as well, so later there is also a setupAudioEncoder. I will not expand on it; in short it adds the audio as another Track to the MPEG4Writer.
As an aside, Google says setupAudioEncoder is put at the end so that the beep played when recording starts does not end up in the recording, but in practice this still has a bug: on some devices the beep is still recorded, and the apps I have seen work around it by playing the sound themselves.

Also, MPEG4Writer's
start(MetaData*)
kicks off two things:
a) startWriterThread

which starts a thread that does the actual writing:

    void MPEG4Writer::threadFunc() {
        ALOGV("threadFunc");

        prctl(PR_SET_NAME, (unsigned long)"MPEG4Writer", 0, 0, 0);

        Mutex::Autolock autoLock(mLock);
        while (!mDone) {
            Chunk chunk;
            bool chunkFound = false;

            while (!mDone && !(chunkFound = findChunkToWrite(&chunk))) {
                mChunkReadyCondition.wait(mLock);
            }

            // Actual write without holding the lock in order to
            // reduce the blocking time for media track threads.
            if (chunkFound) {
                mLock.unlock();
                writeChunkToFile(&chunk);
                mLock.lock();
            }
        }

        writeAllChunks();
    }

b) startTracks

    status_t MPEG4Writer::startTracks(MetaData *params) {
        for (List<Track *>::iterator it = mTracks.begin();
             it != mTracks.end(); ++it) {
            status_t err = (*it)->start(params);

            if (err != OK) {
                for (List<Track *>::iterator it2 = mTracks.begin();
                     it2 != it; ++it2) {
                    (*it2)->stop();
                }

                return err;
            }
        }
        return OK;
    }

and then each Track's start method is called:

    status_t MPEG4Writer::Track::start(MetaData *params) {
        ...

        initTrackingProgressStatus(params);

        ...

        status_t err = mSource->start(meta.get()); // this runs CameraSource::start(); the two are tied to each other

        ...

        pthread_create(&mThread, &attr, ThreadWrapper, this);
        return OK;
    }

    void *MPEG4Writer::Track::ThreadWrapper(void *me) {
        Track *track = static_cast<Track *>(me);

        status_t err = track->threadEntry();
        return (void *) err;
    }

status_t MPEG4Writer::Track::threadEntry()
runs on yet another newly started thread; it loops, continuously reading data from CameraSource (via read). CameraSource's data of course comes back from the driver (see CameraSourceListener: CameraSource keeps the frames arriving from the driver in a List named mFrameReceived and calls mFrameAvailableCondition.signal when data arrives; if recording has not started yet, frames received at that point are simply discarded; note that MediaWriter starts CameraSource's start method first and only then starts the track writers), and then writes it into the file.
Note: strictly speaking, what MPEG4Writer reads here is the data coming out of OMXCodec; the data first reaches CameraSource, the codec encodes it, and only then does MPEG4Writer write it into the file. For how data moves between CameraSource/OMXCodec/MPEG4Writer, see the buffer-flow discussion in http://guoh.org/lifelog/2013/06/interaction-between-stagefright-and-codec/. A heavily simplified sketch of the track loop follows below.
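
The sketch (not the real AOSP code: timestamps, chunking and error handling are omitted, and Mpeg4WriterLike/writeSample() are hypothetical stand-ins):

// Heavily simplified sketch: keep pulling encoded buffers from the track's
// MediaSource (the OMX encoder, which in turn pulls raw frames from
// CameraSource) and hand each one to the container writer until the source
// stops or fails.
void trackWriteLoop(const sp<MediaSource> &source, Mpeg4WriterLike *writer) {
    MediaBuffer *buffer = NULL;
    while (source->read(&buffer) == OK) {   // blocks until the codec produces a frame
        if (buffer->range_length() > 0) {
            writer->writeSample(buffer);    // append the encoded sample to the file
        }
        buffer->release();                  // give the buffer back to the codec
        buffer = NULL;
    }
}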

Looking back, what does Stagefright actually do? I see it mostly as glue: it sits at the MediaPlayerService layer and binds MediaSource, MediaWriter, the codecs and the upper-layer MediaRecorder together. That is probably its biggest role, and Google replacing OpenCORE with it fits Google's usual engineering-driven style (as opposed to the more elaborate academic style; plenty of Google's own things are complex too, but it generally solves problems in as simple a way as it can).
What feels a little unusual is that MediaRecorder lives inside MediaPlayerService; the two sound like opposites. Maybe one day they will be renamed, or split apart, who knows~~
Of course this is only a rough overview; I will try to analyze the codec side separately later.

Some details were left out above; here are a few points worth noting.

1. Time-lapse recording
The counterpart of CameraSource is CameraSourceTimeLapse.

Concretely, dataCallbackTimestamp contains a skipCurrentFrame check.

It uses a few variables for the bookkeeping and the math:
mTimeBetweenTimeLapseVideoFramesUs (1E6/videoFrameRate) // the interval between two kept frames
mLastTimeLapseFrameRealTimestampUs // when the previous frame arrived
From the frame rate it works out the required spacing between two frames, and everything in between is dropped via releaseOneRecordingFrame.
That is, what the driver delivers is unchanged; we simply discard frames ourselves at the software layer, as the sketch below shows.
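
A minimal sketch of that frame-skipping decision (free-standing and illustrative only; the real logic lives inside CameraSourceTimeLapse and uses the member variables named above):

// A frame is kept only when at least timeBetweenFramesUs (1e6 / videoFrameRate)
// has elapsed since the last kept frame; everything in between is dropped via
// releaseOneRecordingFrame().
bool skipCurrentFrame(int64_t frameTimestampUs,
                      int64_t timeBetweenFramesUs,
                      int64_t *lastFrameTimestampUs) {
    if (*lastFrameTimestampUs == 0 ||
            frameTimestampUs - *lastFrameTimestampUs >= timeBetweenFramesUs) {
        *lastFrameTimestampUs = frameTimestampUs;   // keep this frame
        return false;
    }
    return true;                                    // too soon: drop it
}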

For background on time-lapse, see
https://en.wikipedia.org/wiki/Time-lapse_photography

2. When recording needs to use the Camera, it goes through ICameraRecordingProxy, i.e. the RecordingProxy inside Camera (a BnCameraRecordingProxy).
Once the ICameraRecordingProxy has been passed across binder into the service-side process it becomes a Bp, as follows:

case SET_CAMERA: {
    ALOGV("SET_CAMERA");
    CHECK_INTERFACE(IMediaRecorder, data, reply);
    sp<ICamera> camera = interface_cast<ICamera>(data.readStrongBinder());
    sp<ICameraRecordingProxy> proxy =
        interface_cast<ICameraRecordingProxy>(data.readStrongBinder());
    reply->writeInt32(setCamera(camera, proxy));
    return NO_ERROR;
} break;

Inside CameraSource it is used like this:

// We get the proxy from Camera, not ICamera. We need to get the proxy
// to the remote Camera owned by the application. Here mCamera is a
// local Camera object created by us. We cannot use the proxy from
// mCamera here.
mCamera = Camera::create(camera);
if (mCamera == 0) return -EBUSY;
mCameraRecordingProxy = proxy;
mCameraFlags |= FLAGS_HOT_CAMERA;

Open questions:

What is the
List<sp<IMemory> > mFramesBeingEncoded;
in CameraSource for?
Every time a frame finishes encoding, CameraSource keeps it in this list, and when the buffers are released these frames are released in turn. Is that done for efficiency? Why not release each frame as soon as it has been encoded?
And I can't help marvelling once more at Google's habit of writing delete this; -- clever, but it looks odd. A minimal sketch of the idiom follows below.
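
This sketch is not the android::RefBase implementation and is not thread-safe; it only shows that an object deleting itself from a member function is well-defined as long as nothing touches it afterwards:

class RefCounted {
public:
    RefCounted() : mCount(0) {}
    void incRef() { ++mCount; }
    void decRef() {
        if (--mCount == 0) {
            delete this;    // last reference gone: the object destroys itself;
                            // the caller must not touch it after decRef()
        }
    }
protected:
    virtual ~RefCounted() {}  // protected: forces destruction to go through decRef()
private:
    int mCount;
};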

How to Add a System-Level Service to Android

The content here comes from material on the web, the source code and my own understanding; if anything is wrong please point it out. The sample source code may be used freely.

The environment I used is as follows:

PLATFORM_VERSION_CODENAME=AOSP
PLATFORM_VERSION=4.0.9.99.999.9999.99999
TARGET_PRODUCT=full_panda
TARGET_BUILD_VARIANT=userdebug
TARGET_BUILD_TYPE=release

Android ships with a set of basic services, but if you want to add services of your own, you can (for example, when bringing Android up on a particular platform you may need to add or trim some services).

Probably the most widely used services on Android are the multimedia ones; they correspond to a mediaserver process, and for instance
AudioFlinger/MediaPlayerService/CameraService/AudioPolicyService
all run inside that process.

So can we add something similar ourselves? Of course we can.

A few simple pieces are all it takes.

The process that hosts it:

/path/to/aosp/frameworks/base/customized_service/server

The service implementation itself:

/path/to/aosp/frameworks/base/customized_service/libcustomizedservice

Starting it at boot:

/path/to/aosp/system/core/rootdir/init.rc

Add the following to the init.rc file:

service customized /system/bin/customizedserver
    class main
    user media
    group audio camera
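
For the first two pieces (the server process and the service implementation), a minimal sketch of the entry point might look like the following; the class and file names are hypothetical, and the real code is in the repository linked at the end of this post:

// main_customizedserver.cpp: hypothetical sketch of the service process.
// Publish the service with the ServiceManager, then join the binder thread
// pool so incoming transactions get dispatched.
#include <binder/IPCThreadState.h>
#include <binder/IServiceManager.h>
#include <binder/ProcessState.h>
#include <utils/String16.h>

#include "CustomizedService.h"   // assumed to implement BnCustomizedService

using namespace android;

int main(int /* argc */, char ** /* argv */)
{
    sp<ProcessState> proc(ProcessState::self());
    // This is the add_service() call that service_manager.c gates on uid,
    // which is where the PERMISSION DENIED in the log below comes from.
    defaultServiceManager()->addService(String16("test.customized"),
                                        new CustomizedService());
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
    return 0;
}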

I used a Pandaboard:
build the whole ROM, then fastboot flashall.

I/        ( 1080): ServiceManager: 0x40ce4f00
D/        ( 1080): CustomizedService instantiate
D/        ( 1080): CustomizedService created
E/ServiceManager(   94): add_service('test.customized',0x44) uid=xxxx - PERMISSION DENIED
D/        ( 1080): CustomizedService r = -1
D/        ( 1080): CustomizedService destroyed

You get a similar insufficient-permission error if you try to start this service process from the shell; chasing the log quickly reveals the problem:

int do_add_service(struct binder_state *bs,
                   uint16_t *s, unsigned len,
                   void *ptr, unsigned uid, int allow_isolated)
{
    struct svcinfo *si;

    if (!ptr || (len == 0) || (len > 127))
        return -1;

    if (!svc_can_register(uid, s)) {
        ALOGE("add_service('%s',%p) uid=%d - PERMISSION DENIED\n",
             str8(s), ptr, uid);
        return -1;
    }

    si = find_svc(s, len);
 
    ......

    binder_acquire(bs, ptr);
    binder_link_to_death(bs, ptr, &si->death);
    return 0;
}
int svc_can_register(unsigned uid, uint16_t *name)
{
    unsigned n;
    
    if ((uid == 0) || (uid == AID_SYSTEM))
        return 1;

    for (n = 0; n < sizeof(allowed) / sizeof(allowed[0]); n++)
        if ((uid == allowed[n].uid) && str16eq(name, allowed[n].name))
            return 1;

    return 0;
}

As you can see, only uid 0 (root) or AID_SYSTEM may register services, plus whatever is in the allowed list; the simplest fix is to add the new service to that list:

{ AID_MEDIA, "test.customized" },

The allowed list of services lives in

/path/to/aosp/frameworks/base/cmds/servicemanager/service_manager.c

Rebuild the ROM and flash it; if you can see the customizedserver process running after boot, the service has most likely been added successfully.

How does an app call it?

Write an Activity that uses android.os.ServiceManager's getService to obtain the service, then call it.
One thing worth noting is that android.os.ServiceManager is currently hidden from apps; for now we can ignore that and get at it via reflection.
Why it is hidden, whether Android simply does not want us defining our own system-level services or there is some other intended mechanism, I do not know yet.

The current result: getService does return our service and the call succeeds, but my return values include both a String and an int, and the String does not come back. We all know parameter order matters a lot with Parcel, so that may be the reason; I will look into it later. For now the goal was just to get the customized-service path working end to end.

P.S. I later discovered that my rebuilt .so was probably not making it into the ROM, so the .so flashed onto the device was stale; that is why the String I added later was not being returned (previously there were only two ints).
I also ran another experiment: on the service side, calling ServiceManager's getService(…) to access the current service again behaves exactly as if you accessed it through a pointer or reference in the same code. That is the power of Binder. The change is in the following commit:

commit 1875ad41d5f8e505cb4c26599e4626944005f26b
Date: Tue Jun 25 12:10:59 2013 +0800

Symptom: test invoking getService in same process with service
Issue ID: NA
Root Cause: NA
Solution: NA

The full code is at
https://github.com/guohai/and-customized-service.git

Here the Java side is calling a native service; a native client can call a native service just as well. I will not implement that properly here (a rough sketch follows below); if it is ever needed I will come back and write it up.
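
A sketch of that native-to-native path (the transaction code and the interface descriptor are hypothetical; the real interface lives in the repository above):

// Hypothetical native client: look up the service by name and issue a raw
// transaction on the IBinder. With the real ICustomizedService interface you
// would use interface_cast<>() and call methods on the Bp proxy instead.
#include <binder/IServiceManager.h>
#include <binder/Parcel.h>
#include <utils/String16.h>

using namespace android;

int32_t callCustomizedService()
{
    sp<IBinder> binder = defaultServiceManager()->getService(String16("test.customized"));
    if (binder == NULL) {
        return -1;                                           // service not registered
    }
    Parcel data, reply;
    data.writeInterfaceToken(String16("test.customized"));   // must match whatever the Bn side checks
    data.writeInt32(42);                                      // example argument
    binder->transact(IBinder::FIRST_CALL_TRANSACTION /* hypothetical code */, data, &reply);
    return reply.readInt32();
}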

No screenshots, no proof; here they are:
add-customized-service-for-android
add-customized-service-for-android-1

Android Camera System Structure

Words like "architecture" make me nervous, they sound too grand, so I only dare to call this the "structure".

If anything below is wrong, please point it out.

I mentioned some Camera-related things in an earlier post:
http://guoh.org/lifelog/2012/09/asop-camera-source-code-reading-video-snapshot/

for example:

$ANDROID_SRC_HOME/frameworks/base/core/java/android/hardware/Camera.java
$ANDROID_SRC_HOME/frameworks/base/core/jni/android_hardware_Camera.cpp

// Note: in JB these two files have since been moved elsewhere, which goes to show Android is an active project

$ANDROID_SRC_HOME/frameworks/base/include/camera/Camera.h
$ANDROID_SRC_HOME/frameworks/base/libs/camera/Camera.cpp

This post is a set of notes from reading http://source.android.com/devices/camera.html alongside the JB code. It may not be very detailed or rigorously organized; it only covers the key points and the places that need attention. So if you happen to be reading this without much Android Camera background, I suggest first going through http://developer.android.com/guide/topics/media/camera.html and the Camera reference documentation.

Let's get started.
Broadly speaking, Camera breaks down into the following parts:
APP — JNI — NATIVE — BINDER — CAMERA SERVICE — HAL — DRIVER

The AOSP code corresponding to each of these parts is listed below.

APP($ANDROID_SRC_HOME/frameworks/base/core/java/android/hardware/Camera.java)

JNI($ANDROID_SRC_HOME/frameworks/base/core/jni/android_hardware_Camera.cpp)

NATIVE($ANDROID_SRC_HOME/frameworks/av/camera/Camera.cpp)

BINDER($ANDROID_SRC_HOME/frameworks/av/camera/ICameraService.cpp
       $ANDROID_SRC_HOME/frameworks/av/camera/ICameraClient.cpp)

CAMERA SERVICE($ANDROID_SRC_HOME/frameworks/av/services/camera/libcameraservice/CameraService.cpp)

HAL($ANDROID_SRC_HOME/hardware/libhardware/include/hardware/camera.h
    $ANDROID_SRC_HOME/hardware/libhardware/include/hardware/camera_common.h
    $ANDROID_SRC_HOME/frameworks/av/services/camera/libcameraservice/CameraHardwareInterface.h)

For the Galaxy Nexus, the concrete HAL implementation is at

$ANDROID_SRC_HOME/hardware/ti/omap4xxx/camera/CameraHal.cpp
$ANDROID_SRC_HOME/hardware/ti/omap4xxx/camera/CameraHalCommon.cpp

and the headers are under

$ANDROID_SRC_HOME/hardware/ti/omap4xxx/camera/inc/

Also for the Galaxy Nexus:

DRIVER(https://developers.google.com/android/nexus/drivers)

Do you have a rough picture of Camera now?
If so, open up the code and look at whichever part interests you.

You can pull all the code locally with repo, or browse it online.
android/hardware/Camera.java

public static Camera open(int cameraId) {
    return new Camera(cameraId);
}
Camera(int cameraId) {
    mShutterCallback = null;
    mRawImageCallback = null;
    mJpegCallback = null;
    mPreviewCallback = null;
    mPostviewCallback = null;
    mZoomListener = null;

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    native_setup(new WeakReference<Camera>(this), cameraId);
}

The code above is the starting point of everything we want to study: opening the camera. Like any piece of hardware, it must be opened before use.
One thing to note here is mEventHandler. What is it for? The documentation explains it fairly clearly:

Callbacks from other methods are delivered to the event loop of the
thread which called open(). If this thread has no event loop, then
callbacks are delivered to the main application event loop. If there
is no main application event loop, callbacks are not delivered.

If you are writing anything more than a toy program, pay attention to that description.

With native_setup(new WeakReference<Camera>(this), cameraId); the code moves on to
android_hardware_Camera.cpp:

// connect to camera service
static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);

    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }

    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }

    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }

    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);

    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}

With sp<Camera> camera = Camera::connect(cameraId); the code moves on to
av/include/camera/Camera.h, with the implementation in av/camera/Camera.cpp:

sp<Camera> Camera::connect(int cameraId)
{
    ALOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
    if (cs != 0) {
        c->mCamera = cs->connect(c, cameraId);
    }
    if (c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}

getCameraService() here is a fairly key piece and worth a careful look.

// establish binder interface to camera service
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.camera"));
            if (binder != 0)
                break;
            ALOGW("CameraService not published, waiting...");
            usleep(500000); // 0.5 s
        } while(true);
        if (mDeathNotifier == NULL) {
            mDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    ALOGE_IF(mCameraService==0, "no CameraService!?");
    return mCameraService;
}

Then comes the proxy class BpCameraService in av/camera/ICameraService.cpp:

// connect to camera service
virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId)
{
    Parcel data, reply;
    data.writeInterfaceToken(ICameraService::getInterfaceDescriptor());
    data.writeStrongBinder(cameraClient->asBinder());
    data.writeInt32(cameraId);
    remote()->transact(BnCameraService::CONNECT, data, &reply);
    return interface_cast<ICamera>(reply.readStrongBinder());
}

This finally lands in BnCameraService's connect method, but that is a pure virtual function, and CameraService inherits from BnCameraService, so see
av/services/camera/libcameraservice/CameraService.h; the concrete implementation is in av/services/camera/libcameraservice/CameraService.cpp:

sp<ICamera> CameraService::connect(
        const sp<ICameraClient>& cameraClient, int cameraId) {
    int callingPid = getCallingPid();

    LOG1("CameraService::connect E (pid %d, id %d)", callingPid, cameraId);

    if (!mModule) {
        ALOGE("Camera HAL module not loaded");
        return NULL;
    }

    sp<Client> client;
    if (cameraId < 0 || cameraId >= mNumberOfCameras) {
        ALOGE("CameraService::connect X (pid %d) rejected (invalid cameraId %d).",
            callingPid, cameraId);
        return NULL;
    }

    char value[PROPERTY_VALUE_MAX];
    property_get("sys.secpolicy.camera.disabled", value, "0");
    if (strcmp(value, "1") == 0) {
        // Camera is disabled by DevicePolicyManager.
        ALOGI("Camera is disabled. connect X (pid %d) rejected", callingPid);
        return NULL;
    }

    Mutex::Autolock lock(mServiceLock);
    if (mClient[cameraId] != 0) {
        client = mClient[cameraId].promote();
        if (client != 0) {
            if (cameraClient->asBinder() == client->getCameraClient()->asBinder()) {
                LOG1("CameraService::connect X (pid %d) (the same client)",
                     callingPid);
                return client;
            } else {
                ALOGW("CameraService::connect X (pid %d) rejected (existing client).",
                      callingPid);
                return NULL;
            }
        }
        mClient[cameraId].clear();
    }

    if (mBusy[cameraId]) {
        ALOGW("CameraService::connect X (pid %d) rejected"
                " (camera %d is still busy).", callingPid, cameraId);
        return NULL;
    }

    struct camera_info info;
    if (mModule->get_camera_info(cameraId, &info) != OK) {
        ALOGE("Invalid camera id %d", cameraId);
        return NULL;
    }

    int deviceVersion;
    if (mModule->common.module_api_version == CAMERA_MODULE_API_VERSION_2_0) {
        deviceVersion = info.device_version;
    } else {
        deviceVersion = CAMERA_DEVICE_API_VERSION_1_0;
    }

    switch(deviceVersion) {
      case CAMERA_DEVICE_API_VERSION_1_0:
        client = new CameraClient(this, cameraClient, cameraId,
                info.facing, callingPid, getpid());
        break;
      case CAMERA_DEVICE_API_VERSION_2_0:
        client = new Camera2Client(this, cameraClient, cameraId,
                info.facing, callingPid, getpid());
        break;
      default:
        ALOGE("Unknown camera device HAL version: %d", deviceVersion);
        return NULL;
    }

    if (client->initialize(mModule) != OK) {
        return NULL;
    }

    cameraClient->asBinder()->linkToDeath(this);

    mClient[cameraId] = client;
    LOG1("CameraService::connect X (id %d, this pid is %d)", cameraId, getpid());
    return client;
}

Let's take HAL 1.0 here as the example; see
av/services/camera/libcameraservice/CameraClient.h

av/services/camera/libcameraservice/CameraClient.cpp

CameraClient::CameraClient(const sp<CameraService>& cameraService,
        const sp<ICameraClient>& cameraClient,
        int cameraId, int cameraFacing, int clientPid, int servicePid):
        Client(cameraService, cameraClient,
                cameraId, cameraFacing, clientPid, servicePid)
{
    int callingPid = getCallingPid();
    LOG1("CameraClient::CameraClient E (pid %d, id %d)", callingPid, cameraId);

    mHardware = NULL;
    mMsgEnabled = 0;
    mSurface = 0;
    mPreviewWindow = 0;
    mDestructionStarted = false;

    // Callback is disabled by default
    mPreviewCallbackFlag = CAMERA_FRAME_CALLBACK_FLAG_NOOP;
    mOrientation = getOrientation(0, mCameraFacing == CAMERA_FACING_FRONT);
    mPlayShutterSound = true;
    LOG1("CameraClient::CameraClient X (pid %d, id %d)", callingPid, cameraId);
}
status_t CameraClient::initialize(camera_module_t *module) {
    int callingPid = getCallingPid();
    LOG1("CameraClient::initialize E (pid %d, id %d)", callingPid, mCameraId);

    char camera_device_name[10];
    status_t res;
    snprintf(camera_device_name, sizeof(camera_device_name), "%d", mCameraId);

    mHardware = new CameraHardwareInterface(camera_device_name);
    res = mHardware->initialize(&module->common);
    if (res != OK) {
        ALOGE("%s: Camera %d: unable to initialize device: %s (%d)",
                __FUNCTION__, mCameraId, strerror(-res), res);
        mHardware.clear();
        return NO_INIT;
    }

    mHardware->setCallbacks(notifyCallback,
            dataCallback,
            dataCallbackTimestamp,
            (void *)mCameraId);

    // Enable zoom, error, focus, and metadata messages by default
    enableMsgType(CAMERA_MSG_ERROR | CAMERA_MSG_ZOOM | CAMERA_MSG_FOCUS |
                  CAMERA_MSG_PREVIEW_METADATA | CAMERA_MSG_FOCUS_MOVE);

    LOG1("CameraClient::initialize X (pid %d, id %d)", callingPid, mCameraId);
    return OK;
}

Of course the following are also involved:
libhardware/include/hardware/camera.h

libhardware/include/hardware/camera_common.h

av/services/camera/libcameraservice/CameraHardwareInterface.h

status_t initialize(hw_module_t *module)
{
    ALOGI("Opening camera %s", mName.string());
    int rc = module->methods->open(module, mName.string(),
                                   (hw_device_t **)&mDevice);
    if (rc != OK) {
        ALOGE("Could not open camera %s: %d", mName.string(), rc);
        return rc;
    }
    initHalPreviewWindow();
    return rc;
}

This is only an interface; the concrete implementation is up to the vendor.

That covers calls going from the top layers down; next let's look at a call going from the bottom layer back up.

The code shows three callbacks being registered toward the driver:

mHardware->setCallbacks(notifyCallback,
        dataCallback,
        dataCallbackTimestamp,
        (void *)mCameraId);

So these three callbacks can all end up being invoked.
Taking dataCallback as the example, one of its handlers is:

// picture callback - compressed picture ready
void CameraClient::handleCompressedPicture(const sp<IMemory>& mem) {
    disableMsgType(CAMERA_MSG_COMPRESSED_IMAGE);

    sp<ICameraClient> c = mCameraClient;
    mLock.unlock();
    if (c != 0) {
        c->dataCallback(CAMERA_MSG_COMPRESSED_IMAGE, mem, NULL);
    }
}

The call goes back to the app through ICameraClient, which here is a BpCameraClient (why it is a Bp is explained a bit further down):

// generic data callback from camera service to app with image data
void BpCameraClient::dataCallback(int32_t msgType, const sp<IMemory>& imageData,
                  camera_frame_metadata_t *metadata)
{
    ALOGV("dataCallback");
    Parcel data, reply;
    data.writeInterfaceToken(ICameraClient::getInterfaceDescriptor());
    data.writeInt32(msgType);
    data.writeStrongBinder(imageData->asBinder());
    if (metadata) {
        data.writeInt32(metadata->number_of_faces);
        data.write(metadata->faces, sizeof(camera_face_t) * metadata->number_of_faces);
    }
    remote()->transact(DATA_CALLBACK, data, &reply, IBinder::FLAG_ONEWAY);
}

So:

status_t BnCameraClient::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch(code) {
        case NOTIFY_CALLBACK: {
            ...
        } break;
        case DATA_CALLBACK: {
            ALOGV("DATA_CALLBACK");
            CHECK_INTERFACE(ICameraClient, data, reply);
            int32_t msgType = data.readInt32();
            sp<IMemory> imageData = interface_cast<IMemory>(data.readStrongBinder());
            camera_frame_metadata_t *metadata = NULL;
            if (data.dataAvail() > 0) {
                metadata = new camera_frame_metadata_t;
                metadata->number_of_faces = data.readInt32();
                metadata->faces = (camera_face_t *) data.readInplace(
                        sizeof(camera_face_t) * metadata->number_of_faces);
            }
            dataCallback(msgType, imageData, metadata); // because it is a pure virtual function, this calls the implementation in Camera
            if (metadata) delete metadata;
            return NO_ERROR;
        } break;
        case DATA_CALLBACK_TIMESTAMP: {
            ...
        } break;
        default:
            return BBinder::onTransact(code, data, reply, flags);
    }
}

And then:

// callback from camera service when frame or image is ready
void Camera::dataCallback(int32_t msgType, const sp<IMemory>& dataPtr,
                          camera_frame_metadata_t *metadata)
{
    sp<CameraListener> listener;
    {
        Mutex::Autolock _l(mLock);
        listener = mListener;
    }
    if (listener != NULL) {
        listener->postData(msgType, dataPtr, metadata);
    }
}

A call from the app into the service uses the BnCameraClient (the concrete object lives on the client side and the service gets a proxy; you can see this from where new happens).
A call from the service back to the app uses the BpCameraClient (the proxy object living in the service process, standing in for the client's object).
This Bn-to-Bp conversion is accomplished by

sp<ICameraClient> cameraClient = interface_cast<ICameraClient>(data.readStrongBinder());

Below are a few points worth noting:

ICameraClient inherits from IInterface.
BnCameraClient inherits from BnInterface<ICameraClient>, which in turn inherits from ICameraClient and BBinder.

Camera inherits from BnCameraClient.

ICameraService inherits from IInterface.
BnCameraService inherits from BnInterface<ICameraService>, which in turn inherits from ICameraService and BBinder.

ICamera inherits from IInterface.
BnCamera inherits from BnInterface<ICamera>, which in turn inherits from ICamera and BBinder.

CameraService inherits from BinderService<CameraService>, BnCameraService and IBinder::DeathRecipient;
BinderService is mainly a uniform interface for instantiating services.

CameraService has a nested class Client that inherits from BnCamera.
Most of the real implementation is in CameraClient (Camera2Client is Google's way of driving the newer HAL; some vendors, e.g. Qualcomm, implement their HAL that way).
This is the real client: it calls the hardware directly and is called back by it.

When sendCommand pushes a parameter toward the driver, it first checks whether it is something the CameraService layer can handle by itself (some things never need to reach the driver):

cmd == CAMERA_CMD_SET_DISPLAY_ORIENTATION
cmd == CAMERA_CMD_ENABLE_SHUTTER_SOUND
cmd == CAMERA_CMD_PLAY_RECORDING_SOUND
cmd == CAMERA_CMD_SET_VIDEO_BUFFER_COUNT
cmd == CAMERA_CMD_PING
All of these are handled inside CameraService itself;
everything else is handed down to the driver (mHardware->sendCommand(cmd, arg1, arg2);).

Finally, a couple of open questions:

// these are initialized in the constructor.
sp<CameraHardwareInterface> mHardware; // cleared after disconnect()
mHardware is cleared in disconnect().

// Make sure disconnect() is done once and once only, whether it is called
// from the user directly, or called by the destructor.
if (mHardware == 0) return;
This guards against the destructor calling disconnect again after it has already been called once.

Enabling Core Dumps for Debugging on Linux

These are old notes on using core dumps that I ran into while organizing my material; nothing in-depth, just the usage.

If the basics are unfamiliar, look them up on the web:
http://en.wikipedia.org/wiki/Core_dump
http://en.wikipedia.org/wiki/GNU_Debugger

guohai@KNIGHT:~$ ulimit -c
0
guohai@KNIGHT:~$ ulimit -c unlimited
guohai@KNIGHT:~$ ulimit -c
unlimited
guohai@KNIGHT:~$ ./a.out 
Floating point exception (core dumped)

a.out is the program to analyze; the commands above are all it takes on Linux, very simple.
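
For reference, the a.out above came from a trivial program along these lines (a reconstruction, since the original is not shown): an integer division by zero raises SIGFPE, which is what produces the "Floating point exception (core dumped)" message.

// crash.cpp: deliberately divide by zero so the kernel sends SIGFPE and,
// with `ulimit -c unlimited`, writes a core file for later inspection.
#include <cstdio>

int main()
{
    volatile int zero = 0;          // volatile so the compiler cannot fold the division away
    std::printf("%d\n", 10 / zero); // integer division by zero raises SIGFPE on x86
    return 0;
}

Loading the resulting core with gdb ./a.out <corefile> and typing bt then points straight at the faulting line.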

For more details see
http://www.cppblog.com/kongque/archive/2011/03/07/141262.aspx

So how do you enable this on Android (root access required first)?

$ adb remount
$ adb shell
root@android:/ # ulimit -c                                                     
unlimited

Change the path where core dump files are stored (adjust the path to your needs):

root@android:/ # echo "/data/coredump/%p_%e" > /proc/sys/kernel/core_pattern

With that, whenever a native crash occurs, a corresponding core dump file appears.
(Sometimes no core dump is produced; a missing /data/coredump directory can also prevent one from being written, presumably because it cannot be created without permission.)

Then copy the file to the host machine and load it in GDB to analyze the cause of the crash.

———–EOF———–

Dalvik Bookmarks

https://android.googlesource.com/platform/dalvik.git
Dalvik

https://android.googlesource.com/platform/dalvik.git/+/master/docs/
Dalvik Docs

http://elinux.org/Android_Dalvik_VM

http://en.wikipedia.org/wiki/Dalvik_(software)

Dalvik, Android’s virtual machine, generates significant debate

Analysis of the DEX file format

http://hi.baidu.com/seucrcr/item/2b988c570cb63c9208be1778
[Translation] Dalvik: how to control the VM

http://hllvm.group.iteye.com/group/topic/17798
[Notes] A pile of material on the Dalvik VM's JIT compiler (dumping… work in progress)

Run Dalvik on X86

I had wanted to build my own x86 Dalvik to play with for a long time; an early attempt failed and I set it aside. In the last few months I picked it up again, followed /PATH/TO/AOSP/dalvik/docs/hello-world.html to build dalvikvm, created the dex file, set all the environment variables, and running it produced this error:

E/dalvikvm(10105): execv '/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/bin/dexopt' failed: No such file or directory
W/dalvikvm(10102): DexOpt: --- END 'Foo.jar' --- status=0x0100, process failed
E/dalvikvm(10102): Unable to extract+optimize DEX from 'Foo.jar'
Dalvik VM unable to locate class 'Foo'
W/dalvikvm(10102): threadid=1: thread exiting with uncaught exception (group=0xf5a1d9c8)
java.lang.NoClassDefFoundError: Foo
    at dalvik.system.NativeStart.main(Native Method)
Caused by: java.lang.ClassNotFoundException: Didn't find class "Foo" on path: DexPathList[[zip file "Foo.jar"],nativeLibraryDirectories=[/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/lib]]
    at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:53)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:501)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:461)
    ... 1 more

I searched the web for a long time without finding anyone who had seen this, so in desperation I posted a question on StackOverflow; a week went by and nobody touched it.

In the end I asked a colleague who works on Dalvik, and with his help the problem was solved.

It turned out my ANDROID_ROOT was wrong: either I had misunderstood /PATH/TO/AOSP/dalvik/docs/hello-world.html, or its content is a little out of date.
After fixing it everything ran fine; my modified rund script is at https://gist.github.com/guohai/5048153

Here is the main procedure:
1. Download AOSP source code

2. Compile dalvik vm on Ubuntu 11.10 X64

source build/envsetup.sh
lunch 2 # choose full_x86-eng, because we want to run it on X86 directly
make dalvikvm core dexopt ext framework android.policy services

3. Compile Java code and package it to dex.

4. Run & enjoy it.

guohai@KNIGHT:~/dev/src/android/git/aosp$ ./rund -cp Foo.jar Foo
I/dalvikvm( 3473): DexOpt: mismatch dep name: '/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/core.odex' vs. '/system/framework/core.odex'
E/dalvikvm( 3473): /home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/ext.jar odex has stale dependencies
I/dalvikvm( 3473): Zip is good, but no classes.dex inside, and no valid .odex file in the same directory
I/dalvikvm( 3473): DexOpt: mismatch dep name: '/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/core.odex' vs. '/system/framework/core.odex'
E/dalvikvm( 3473): /home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/framework.jar odex has stale dependencies
I/dalvikvm( 3473): Zip is good, but no classes.dex inside, and no valid .odex file in the same directory
I/dalvikvm( 3473): DexOpt: mismatch dep name: '/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/core.odex' vs. '/system/framework/core.odex'
E/dalvikvm( 3473): /home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/android.policy.jar odex has stale dependencies
I/dalvikvm( 3473): Zip is good, but no classes.dex inside, and no valid .odex file in the same directory
I/dalvikvm( 3473): DexOpt: mismatch dep name: '/home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/core.odex' vs. '/system/framework/core.odex'
E/dalvikvm( 3473): /home/guohai/dev/src/android/git/aosp/out/target/product/generic_x86/system/framework/services.jar odex has stale dependencies
I/dalvikvm( 3473): Zip is good, but no classes.dex inside, and no valid .odex file in the same directory
I/dalvikvm( 3473): DexOpt: source file mod time mismatch (4244043d vs 425baf6a)
Hello, Dalvik!

Some material that helped me along the way:
http://stackoverflow.com/questions/6146983/helloworld-cannot-run-under-dalvikvm
http://stackoverflow.com/questions/11773506/how-to-launch-jar-with-exec-app-process-on-android-ics
http://blog.csdn.net/johnnynuaa/article/details/6543425

And here, for reference, is the launch script that did not work for me:

#!/bin/sh

# base directory, at top of source tree; replace with absolute path
base=`pwd`

# configure root dir of interesting stuff
root=$base/out/target/product/generic_x86/system
export ANDROID_ROOT=$root # why can't this point to the target path?

export LD_LIBRARY_PATH=$root/lib:$LD_LIBRARY_PATH

# configure bootclasspath
bootpath=$root/framework
export BOOTCLASSPATH=$bootpath/core.jar:$bootpath/ext.jar:$bootpath/framework.jar:$bootpath/android.policy.jar:$bootpath/services.jar

# this is where we create the dalvik-cache directory; make sure it exists
export ANDROID_DATA=/tmp/dalvik_$USER
mkdir -p $ANDROID_DATA/dalvik-cache

exec $base/out/host/linux-x86/bin/dalvikvm -Xdexopt:none $@

[C++ for Beginners] Copy Constructors and Assignment Operators

What a copy constructor is, when you need one, and when you should disallow it:
http://blog.csdn.net/arssqz/article/details/6361986

Why and how to disallow the copy constructor:
http://jrdodds.blogs.com/blog/2004/04/disallowing_cop.html
http://c2.com/cgi/wiki?YouArentGonnaNeedIt

What traps the compiler sets for you:
http://www.cnblogs.com/yfanqiu/archive/2012/05/08/2489905.html

The difference between Google's macro and Qt's:
http://stackoverflow.com/questions/1454407/macros-to-disallow-class-copy-and-assignment-google-vs-qt

A typical version of the Google macro:
http://src.chromium.org/svn/trunk/src/third_party/cld/base/macros.h

// A macro to disallow the copy constructor and operator= functions
// This should be used in the private: declarations for a class
//
// For disallowing only assign or copy, write the code directly, but declare
// the intend in a comment, for example:
// void operator=(const TypeName&);  // DISALLOW_ASSIGN
// Note, that most uses of DISALLOW_ASSIGN and DISALLOW_COPY are broken
// semantically, one should either use disallow both or neither. Try to
// avoid these in new code.
#define DISALLOW_COPY_AND_ASSIGN(TypeName) \
  TypeName(const TypeName&);               \
  void operator=(const TypeName&)

What I mainly want to record is a problem I ran into myself. The compiler said:

AbstractClass.h:20:12: cannot declare parameter ‘<anonymous>’ to be of abstract type ‘AbstractClass’

For someone with a bit more experience or solid theory, that single line should already bring to mind an error of roughly this shape:

void im_an_error_method(AbstractClass )
{
    ac.im();
}

A parameter with no variable name, whose type is an abstract class (which cannot be instantiated).

Unfortunately I could not be sure, because the header was not mine and I did not have enough evidence to point out someone else's mistake.
So all I could do was check my own code, and it turned out the offending line was this one in his header:

DISALLOW_COPY_AND_ASSIGN(AbstractClass);

which is strange: written like this it should not fail, because this is how everyone writes it.
So I tried commenting the line out, and also expanding it into code by hand, i.e. like this:

AbstractClass(const AbstractClass&);
void operator=(const AbstractClass&);

and found that it then compiled fine.
By now it should be clear why the compiler error mentions '<anonymous>'.

The root cause was eventually pinned down to a mis-written macro, which only blows up in this kind of corner case.
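
For illustration, one way to hit exactly that error (my guess at what the broken macro looked like; the original macro is not shown here) is a version that drops the '&', so the anonymous operator= parameter is an abstract class taken by value:

// Broken variant (hypothetical): note the missing '&'. The unnamed parameter
// is now of abstract type TypeName passed by value, which yields
// "cannot declare parameter '<anonymous>' to be of abstract type".
#define DISALLOW_COPY_AND_ASSIGN_BROKEN(TypeName) \
  TypeName(const TypeName&);                      \
  void operator=(const TypeName)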

Finally, to wrap up, an example borrowed from http://www.artima.com/cppsource/bigtwo.html:

class Example {
  SomeResource* p_;
public:
  Example() : p_(new SomeResource()) {}
  ~Example() {
      delete p_;
   }
};

Are there errors in this example? How many?
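
(Spoiler: the example compiles, but it is missing the other two of the "big three". The compiler-generated copy constructor and copy assignment operator just copy the pointer, so copying an Example ends in a double delete. A sketch of one fix, assuming SomeResource itself is copyable:)

// Stand-in for the real resource type (assumption: it is copyable).
struct SomeResource { int value; };

// Either disallow copying entirely (as with the DISALLOW_COPY_AND_ASSIGN
// macro above) or implement the copy operations so that each object owns
// its own SomeResource. Sketch of the latter:
class Example {
  SomeResource* p_;
public:
  Example() : p_(new SomeResource()) {}
  Example(const Example& other) : p_(new SomeResource(*other.p_)) {}
  Example& operator=(const Example& other) {
      if (this != &other) {
          SomeResource* copy = new SomeResource(*other.p_); // copy first, for exception safety
          delete p_;
          p_ = copy;
      }
      return *this;
  }
  ~Example() { delete p_; }
};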

[C++ for Beginners] Abstract Classes

Learning C++ as a beginner:
http://stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list
When the theory does not make sense, practice; when practice does not work, go back to the theory.

AbstractClass.h

#ifndef ABSTRACTCLASS_H_
#define ABSTRACTCLASS_H_

class AbstractClass
{
  public:
    AbstractClass();
    virtual ~AbstractClass();
  
  public:
    virtual void im() = 0;
};
#endif /* ABSTRACTCLASS_H_ */

AbstractClass.cc

#include "AbstractClass.h"

AbstractClass::AbstractClass()
{
}

AbstractClass::~AbstractClass()
{
}

void AbstractClass::im()
{}

Main.cc

#include <iostream>

#include "AbstractClass.h"

using namespace std;

class ConcreteClass : public AbstractClass
{
  public:
    ConcreteClass() {};
    virtual ~ConcreteClass() {};

  public:
    virtual void im()
    {
        cout << "ConcreteClass::im" << endl;
    };
};

void hello(AbstractClass *ac)
{
    ac->im();
}

int main(int argc, char* argv[])
{
    AbstractClass *pAC = new ConcreteClass();
    hello(pAC);
    delete pAC;
    pAC = NULL;
    return 0;
}
~/dev/workspace/labs/ccpp/cc_abstract_class$ g++ Main.cc 
/tmp/ccQrfLyb.o: In function `ConcreteClass::ConcreteClass()':
Main.cc:(.text._ZN13ConcreteClassC2Ev[_ZN13ConcreteClassC5Ev]+0x14): undefined reference to `AbstractClass::AbstractClass()'
/tmp/ccQrfLyb.o: In function `ConcreteClass::~ConcreteClass()':
Main.cc:(.text._ZN13ConcreteClassD2Ev[_ZN13ConcreteClassD5Ev]+0x1f): undefined reference to `AbstractClass::~AbstractClass()'
/tmp/ccQrfLyb.o:(.rodata._ZTI13ConcreteClass[typeinfo for ConcreteClass]+0x10): undefined reference to `typeinfo for AbstractClass'
collect2: ld returned 1 exit status

Compiling it like this instead works:

~/dev/workspace/labs/ccpp/cc_abstract_class$ g++ Main.cc AbstractClass.cc

2012 in Review

The deepest impression this year left on me is how fast it went. The whole year was spent being busy: work took most of the time, plus my own things, and there were very few weekends where I could idle for two whole days. Technically I did not grow much, and I never got around to most of the things I wanted to learn, so on the whole I am not satisfied with this year.
I often see strangers at street stalls, in malls, in restaurants, always looking so happy, and wonder why I do not feel that way. It is not that life feels unhappy; I just do not find much to be excited about, as if it is all quite ordinary. One change I have noticed in myself this year is how I talk: outside formal occasions my speech has become casual and flippant. I used to think I had no temper at all; whether working alone or with others I always seemed mild, but this year I found I can still get angry sometimes. My boss told me not to always play the nice guy and to say NO when NO is called for; that is something I should learn.
I have been at my current company for over a year now and feel so-so about it; maybe the atmosphere does not really suit me. The company emphasizes employees challenging one another and driving work through bugs (bugs do not necessarily get fixed: people first try to push them to someone else or leave them, and only fix them when there is no other way). I suspect few people genuinely want to make any particular thing good (this is not arrogance or badmouthing the company); it is understandable: once a company is big and many people have to cooperate, everyone has their own ideas. One fairly senior engineer's signature says, roughly, that very few people in this company still think about the architecture of a piece of software, and I have also read an article from 2009 by an engineer who has since left, about the company's problems; to this day I feel it still largely applies. Then again, that is probably something for senior management to think about. (There I go again, picking at problems T_T; I suppose I am the real perfectionist, hahaha.)
I did read a few books this year, though not many, and I did not read the technical ones seriously. One left a deep impression (not a technical one, of course): a book about working-holiday travel. I wonder when I will be able to put everything down and go wander for a while?
Next year I hope to become a reasonably proficient C/C++ programmer on Linux (after five or six years of Java, C++ does not look that hard now, just unfamiliar and awkward; I often worry I will become the kind of person who never learned C++ properly and has forgotten Java), and to learn the basics of Dalvik (which I have been meaning to do forever and still have not started).
And spend more time with Dad and Mom; they really are starting to get old...