How to link a dynamic library? Is there any video library available for linking?

I have built ffmpeg and it generated a file, but I don't know how to link it.
Instead I built static libraries (*.a) to link, but they produce a lot of undefined reference errors.
If the ffmpeg library could be integrated into TADP, it would be very helpful for game development. OpenCV is integrated into TADP, but why is there no sample that can be opened with Visual Studio?

Another question: signing options.
How do the signing options work?
They used to work fine, but now they show the error
Failed to create a new key
and then opening the signing options again shows
Set property 'System.Windows.Controls.Primitives.ToggleButton.IsChecked

By the way,
the undefined reference errors from linking the static library (ffmpeg) seem like they could be solved by setting this option.

Thanks for any help.

Hi FatmingWang2,
You can add these via Project Properties > Linker in Nsight Tegra:
add the library path under Linker > General > Additional Library Directories,
and add the library name under Linker > Input > Additional Dependencies.
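In case it helps to see what those two properties expand to, they map onto the standard GCC/ld switches; a hypothetical command line (the toolchain name, paths, and library names below are illustrative placeholders, not taken from this thread):

```shell
# -L = Additional Library Directories, -l = Additional Dependencies.
arm-linux-androideabi-g++ -shared game.o \
    -L/path/to/ffmpeg/android/lib \
    -lavformat -lavcodec -lavutil \
    -o libgame.so
```

Note that link order matters with static archives: a library must appear after the objects (or other libraries) that reference it.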

Hello Victorli, I had already added the libraries, but the problem is that there are still some unreferenced functions that need to be linked (those functions are never called), so I have to ignore the undefined references. The answer I googled is LOCAL_ALLOW_UNDEFINED_SYMBOLS; is this flag available in Visual Studio for Tegra? FFmpeg is a big project to me, and it seems too hard to make a VS Tegra version of it in a short time.

There are tutorials, but they link it as a .so,
and I did add the necessary .so files at
Properties -> Ant Build -> Additional Dependencies -> Native Library Directories and Dependencies,
but it still doesn't work :(

Sorry, this was an accidental click.

Hi FatmingWang2,

You can try setting Linker -> Advanced -> Report Undefined Symbols to No; this property controls the same linker flag as LOCAL_ALLOW_UNDEFINED_SYMBOLS in ndk-build. Hope that helps!
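For comparison, in an ndk-build project the same effect comes from the Android.mk variable mentioned above; a minimal sketch (the module and source names are placeholders):

```makefile
# Android.mk fragment: let the link succeed even if some symbols
# remain unresolved; they are only looked up when first used at runtime.
include $(CLEAR_VARS)
LOCAL_MODULE := game
LOCAL_SRC_FILES := main.cpp
LOCAL_ALLOW_UNDEFINED_SYMBOLS := true
include $(BUILD_SHARED_LIBRARY)
```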

Hello DmitryPolyanitsa

I have set Linker -> Advanced -> Report Undefined Symbols to No,
but it still looks for the references and a timeout happens.
I didn't call any ffmpeg function, just tried to test linking the library, and it shows this message:

05-05 21:15:23.865 10029 10029 E dalvikvm: dlopen("/data/app-lib/com.nvidia.devtech.BluffingGirl-1/") failed: Cannot load library: soinfo_relocate(linker.cpp:975): cannot locate symbol "avpriv_aac_parse_header" referenced by ""…
05-05 21:16:23.721 868 893 W ActivityManager: Activity pause timeout for ActivityRecord{41c9aa40 u0 com.nvidia.devtech.BluffingGirl/.BluffingGirl}

I will test loading the .so (ffmpeg) files on the Java side later.

thanks for the help!

That setting is for compile time, but you are showing logcat output, which is runtime.
You probably want to use flags on dlopen, e.g. RTLD_LAZY.

Hey dhorowitz

I want to make a game that can interact with video, so I have to decode video frames in real time and mix them with vertex animation and terrain.
Here is a video of what I want to do.

I have built ffmpeg following the instructions from
and I built both a static library and a shared library.

Now I get this error message

Where can I set that flag, dhorowitz?

I have spent weeks trying to figure out how to make it work, but no progress at all…
Is there any other third-party video decoding library that can be used directly in Tegra Visual Studio?
Thanks for any help :)

I googled a similar one

but #pragma comment seems not to work :(

Hi FatmingWang2,

You'll need to use dlopen to be able to pass the RTLD_LAZY flag. Here's its Linux man page:

Hello Dmitry

Finally I know why the library can't be linked well, and it's unbelievable: if the library is not COMPILED by Tegra Visual Studio, it won't work…
I tried the easiest possible project, linking with the OpenCV provided by TADP (C:\NVPACK\OpenCV-2.4.5-Tegra-sdk-r2\sdk\native),
and it still doesn't work :(
Here is my test project; I have linked all the OpenCV libraries and it just doesn't work.

Now I am using parallel threads to decode JPEG and render it (more than 3 threads becomes slower than 2), but the fps is only around 15 (on a Tegra 3 CPU).

Now I am begging, if it's possible… is any HW decoder library available for Tegra Visual Studio?
Or would using Eclipse for such things be better?

Stepping back for just a minute, I’d wonder if perhaps you should skip native code entirely. Android includes the ability via SurfaceTexture to HW decode video into a GLES texture. There would be two options there: 1) if the 3D code is not overly complex, do all of the rendering in Java. Otherwise, you could potentially use GLSurfaceView for your GL context setup, so the context is available in Java, but do your rendering in native code. Simply pass down the texture ID of the SurfaceTexture to native code and render it there.

Perhaps using such a mixed-mode Java/native system could allow you to use the existing Android classes that give you video-to-texture via hardware without having to resort to lots of native libraries?

Alternatively, if you need a native method, look into OpenMAX AL in the Android NDK - this also should allow video decode to a SurfaceTexture. That could allow for pure-native HW video to texture decode.

Might these alternatives give you what you need without third-party libraries?
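If the mixed-mode route is taken, the native side mostly needs to know that SurfaceTexture hands over an external texture; a rough sketch, where textureId is assumed to be the GL name the Java side passed into new SurfaceTexture(texName), and drawFullScreenQuad() is a hypothetical helper:

```cpp
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>  // GL_TEXTURE_EXTERNAL_OES

void drawFullScreenQuad();  // hypothetical helper, declared elsewhere

// Sketch: draw the current video frame from a SurfaceTexture in native code.
// The Java side must call SurfaceTexture.updateTexImage() each frame first.
void RenderVideoFrame(GLuint textureId)
{
    glActiveTexture(GL_TEXTURE0);
    // SurfaceTexture output is an OES external texture, not GL_TEXTURE_2D;
    // the fragment shader must declare
    //   #extension GL_OES_EGL_image_external : require
    // and sample it through a samplerExternalOES uniform.
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureId);
    drawFullScreenQuad();
}
```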

Hello lbishop,
The problem is that I have to manipulate the pixel data from the video: I have to mix the terrain and the video frames
to make it look like a 3D-model video.
Android does provide HW decoders (OpenMAX, Stagefright) but it protects all the pixel data and makes the glTex* calls in private functions.
I have checked the native-media sample in the NDK.
I tried theNativeWindow = ANativeWindow_fromSurface(env, surface);
and fetching the pixel data from the surface, but it failed (OnDrawFrame won't be called if setContentView is not set, and I don't want to call setContentView). OpenCV is another easy way to get pixel data, but linking it doesn't work.
I also tried to compile libjpeg-turbo, but the compile problems got me down lol…
For now I extract all the video frames on the PC (with OpenCV) and save each frame as a JPEG.
It seems to work now, but the fps is bad even though I use parallel threads and downsample the video to 640x360:
the fps is around 20, but on a dual-core Android device I only get 9…
Anyway, thanks for the great help. I will take a look at the libjpeg-turbo source code and try to compile it.

Seems a shame that the video frame processing could not also be done on the GPU. Is the CPU processing of the video data such that you could not use a fragment shader to do it? By keeping all of the pixel-processing on the GPU, you could avoid both the CPU work and the overhead of uploading the processed video frames to the GPU.

What is the nature of the per-pixel processing you need to do to the video? I recall in the early days of Tegra when we were working on examples of video processing, one of our tricks was to use the GPU to downsize the video frames we got from SurfaceTexture (actually, back then, it was a different API to get the video into a texture, but the result was the same). We rendered to a smaller FBO, and bilerp texture filtering allowed us to downsize the image filtered. Then, we read back the smaller image to the CPU. While not optimal, it was better than reading the entire image back.

Basically, it all comes down to the nature of the per-pixel processing you have to do to the video. If it can be done efficiently on the GPU with shaders, you could really win out and avoid the 3rd-party libs.
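The downsize-then-read-back trick described above could look roughly like this, assuming an FBO whose color attachment is smallW x smallH and the same hypothetical drawFullScreenQuad() helper:

```cpp
#include <GLES2/gl2.h>

void drawFullScreenQuad();  // hypothetical helper, declared elsewhere

// Sketch: render the full-size video texture into a smaller FBO
// (bilinear filtering performs the downsizing on the GPU), then read
// back only the small image, cutting the glReadPixels cost.
void ReadBackDownsizedFrame(GLuint smallFbo, GLuint videoTex,
                            int smallW, int smallH, void *outRgba)
{
    glBindFramebuffer(GL_FRAMEBUFFER, smallFbo);
    glViewport(0, 0, smallW, smallH);
    glBindTexture(GL_TEXTURE_2D, videoTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    drawFullScreenQuad();
    glReadPixels(0, 0, smallW, smallH, GL_RGBA, GL_UNSIGNED_BYTE, outRgba);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
```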

Hey lbishop,
I have used BitmapFactory to generate bitmap data from the JPEG images,
and the performance is good, but now I get
JNI ERROR (app bug): local reference table overflow (max=512)
I have called Bitmap recycle and DeleteLocalRef, but it seems there is something whose reference I haven't released yet.
Is anything wrong with this code?
Thanks for the JNI-side hint :)

#include <jni.h>
#include <android/bitmap.h>
#include <cstdint>
#include <cstring>

JNIEnv * g_pMultiThreadEnv = 0;

bool JpegToRawPixelData(char *e_pJpegData, int e_iLength, char *e_pStorePixelData)
{
    //JNIEnv * env = cGameApp::m_spThreadEnv;
    JNIEnv * env = g_pMultiThreadEnv;
    // copy the JPEG data into a Java byte array
    jbyteArray byte_array = env->NewByteArray(e_iLength);
    env->SetByteArrayRegion(byte_array, 0, e_iLength, (jbyte *)e_pJpegData);
    // get the BitmapFactory class and its decodeByteArray method
    jclass bitmap_factory_class = env->FindClass("android/graphics/BitmapFactory");
    jmethodID decode_byte_array_method = env->GetStaticMethodID(bitmap_factory_class, "decodeByteArray", "([BII)Landroid/graphics/Bitmap;");
    // decode the JPEG into a Bitmap object
    jobject jbitmap = env->CallStaticObjectMethod(bitmap_factory_class, decode_byte_array_method, byte_array, 0, e_iLength);
    AndroidBitmapInfo bitmapInfo;
    int ret;
    if ((ret = AndroidBitmap_getInfo(env, jbitmap, &bitmapInfo)) < 0)
        return false;
    if (bitmapInfo.format != ANDROID_BITMAP_FORMAT_RGBA_8888)
        return false;
    // lock the pixels and copy them out
    void *bitmapPixels;
    if ((ret = AndroidBitmap_lockPixels(env, jbitmap, &bitmapPixels)) < 0)
        return false;
    uint32_t *src = (uint32_t *)bitmapPixels;
    int pixelsCount = bitmapInfo.height * bitmapInfo.width;
    memcpy(e_pStorePixelData, src, sizeof(uint32_t) * pixelsCount);
    AndroidBitmap_unlockPixels(env, jbitmap);
    // recycle the bitmap
    jclass bitmapCls = env->GetObjectClass(jbitmap);
    jmethodID l_RecycleFunction = env->GetMethodID(bitmapCls, "recycle", "()V");
    if (l_RecycleFunction == 0)
        return false;
    env->CallVoidMethod(jbitmap, l_RecycleFunction);
    return true;
}

Sorry, I forgot to call DeleteLocalRef on everything;
now it just works fine.
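For reference, the overflow in the function above comes from the JNI local references that are never released: NewByteArray, FindClass, CallStaticObjectMethod, and GetObjectClass each create one, and calling the function many times per frame from native code fills the 512-entry table. A sketch of the cleanup, using the variable names from the code above:

```cpp
// Release the locals before returning; otherwise each call leaks four
// entries and the local reference table eventually overflows.
env->DeleteLocalRef(byte_array);
env->DeleteLocalRef(bitmap_factory_class);
env->DeleteLocalRef(bitmapCls);
env->DeleteLocalRef(jbitmap);
```

The early-return paths need the same cleanup (plus an AndroidBitmap_unlockPixels on the path where the pixels are already locked).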