Using Java API¶
IDLive Face requires Java 8 or newer. Desktop releases support 64-bit x86 only. Android AAR releases support Armv7-A, Armv8-A and x86.
Set up¶
The package for the IDLive Face Java API is net.idrnd.idliveface:
import net.idrnd.idliveface.*;
The jar file with the package can be found in the SDK's libs/java directory. You need to add the jar to the classpath when running java or javac:
java -cp "/opt/idliveface/libs/java/idliveface.jar" ...
As the Java package is backed by native libraries, you also need to specify their location. They reside in the SDK's libs directory. The approach differs depending on the operating system:
- For Linux you need to set the java.library.path property:
  java -cp "/opt/idliveface/libs/java/idliveface.jar" -Djava.library.path="/opt/idliveface/libs" ...
- For Windows you need to add the directory to the Path environment variable. Please refer to the DLL discovery article.
This is only required when running a Java program; the compiler doesn't need the native libraries.
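For illustration, on Windows the equivalent invocation might look like this (assuming the SDK was unpacked to C:\idliveface; the actual install path is up to you):

```shell
rem Make the native libraries discoverable, then run the program (Windows cmd)
set Path=C:\idliveface\libs;%Path%
java -cp "C:\idliveface\libs\java\idliveface.jar" ...
```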
Closeable objects¶
Some of the API objects are backed by native memory and implement the AutoCloseable interface. You always need to close them, either manually with the close method, or using the try-with-resources statement:
try (Image image = decoder.decodeFile(...)) {
...
}
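The mechanics are the same as for any other AutoCloseable. A minimal standalone sketch (using a hypothetical stand-in class, not an SDK type) shows that close runs automatically when the block exits:

```java
public class CloseDemo {
    // Stand-in for an SDK object backed by native memory.
    static class Resource implements AutoCloseable {
        @Override
        public void close() {
            System.out.println("closed");
        }
    }

    public static void main(String[] args) {
        try (Resource r = new Resource()) {
            System.out.println("using");
        }
        // close() has already run here, even if the body had thrown.
    }
}
```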
Initialize¶
The entry point to IDLive Face is called the Blueprint. It's a factory for all main SDK objects, and it's also used to configure IDLive Face. Initialize it with the path to the SDK's data directory (we assume the SDK was installed to /opt/idliveface):
import java.nio.file.*;
Path IDLIVEFACE_HOME = Paths.get("/opt/idliveface");
Blueprint blueprint = new Blueprint(IDLIVEFACE_HOME.resolve("data"));
Next use the blueprint to create the image decoder and the face analyzer:
ImageDecoder decoder = blueprint.createImageDecoder();
FaceAnalyzer analyzer = blueprint.createFaceAnalyzer();
You don't need the blueprint afterwards. IDLive Face reports all errors via exceptions; you need to catch them in your exception handler:
try {
...
} catch (IDLiveFaceException e) {
System.out.println("Error: " + e.getMessage());
}
Load image¶
To load a compressed image from a file, use the image decoder:
Image image = decoder.decodeFile(IDLIVEFACE_HOME.resolve("examples/images/real_face.jpg"));
IDLive Face supports JPEG, PNG and BMP formats. You can also load the image from memory:
byte[] imageContent =
Files.readAllBytes(IDLIVEFACE_HOME.resolve("examples/images/real_face.jpg"));
Image image = decoder.decode(imageContent);
If the image can't be decoded, the decoder will throw ImageDecodingException.
If you have an uncompressed image, pass its content and dimensions to the Image constructor:
byte[] pixels = ...;
int width = 600;
int height = 800;
Image image = new Image(pixels, width, height, PixelFormat.RGB);
IDLive Face supports RGB/BGR (3 bytes per pixel) and grayscale (1 byte per pixel) formats.
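Since the constructor takes a raw byte array, the buffer length must match the declared dimensions and format. A quick sanity check in plain Java, with no SDK dependency (the 3-byte and 1-byte strides come from the formats listed above):

```java
public class BufferSizeCheck {
    public static void main(String[] args) {
        int width = 600;
        int height = 800;

        // RGB/BGR: 3 bytes per pixel; grayscale: 1 byte per pixel.
        int rgbLength = width * height * 3;
        int grayLength = width * height;

        System.out.println(rgbLength);   // 1440000
        System.out.println(grayLength);  // 480000
    }
}
```

Passing a buffer of any other length for the given width, height and format indicates a bug in how the pixels were produced.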
Analyze image¶
To analyze the image, use the face analyzer:
FaceAnalysisResult result = analyzer.analyze(image);
Once the analysis is complete, you need to first check the status field. If it's set to INVALID, the image failed validation and was not analyzed. Such images do not conform to the requirements: either it's not possible to analyze them at all, or the result would not have decent accuracy. All validations the image failed are listed in the failedValidations field.
if (result.getStatus() == FaceStatus.INVALID) {
System.out.println("Image did not pass validations and is not suitable for analysis");
System.out.println("Failed validations: " + result.getFailedValidations());
}
For images that pass the validations, the status will be either GENUINE or NOT_GENUINE. The result's genuineProbability field will also be set. It's a float value showing how confident we are that the face in the image belongs to a real person, with 1 being the most confident. If the value is closer to 0, the face does not look genuine and may be spoofed. The decision threshold is 0.5: when the probability is 0.5 or higher, the status will be set to GENUINE; for lower probabilities it will be set to NOT_GENUINE.
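The threshold rule described above can be sketched in plain Java (the enum here is a local stand-in for FaceStatus, used only to illustrate the documented 0.5 cut-off):

```java
public class ThresholdDemo {
    enum Status { GENUINE, NOT_GENUINE }

    // Probability >= 0.5 maps to GENUINE, anything lower to NOT_GENUINE.
    static Status decide(double genuineProbability) {
        return genuineProbability >= 0.5 ? Status.GENUINE : Status.NOT_GENUINE;
    }

    public static void main(String[] args) {
        System.out.println(decide(0.87)); // GENUINE
        System.out.println(decide(0.12)); // NOT_GENUINE
        System.out.println(decide(0.5));  // GENUINE (the threshold itself counts as genuine)
    }
}
```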
So your decision logic may look like this:
if (result.getStatus() == FaceStatus.INVALID) {
// Ask to retake the image, use result.getFailedValidations() as hints
} else if (result.getStatus() == FaceStatus.NOT_GENUINE) {
// Reject the image
} else {
// Accept the image
}
Additionally, the result contains a bounding box around the processed face. It is returned whenever a face was detected, regardless of whether the image was analyzed. For example, if the face in the image is too small, the status will be INVALID, but the box field will still be set. You can use it, along with failedValidations, to diagnose why the image was not analyzed.
if (result.getBox().isPresent()) {
System.out.println("Detected face: " + result.getBox().get());
}
Example program¶
Here is a program that uses everything described in the previous sections:
Example.java
import net.idrnd.idliveface.*;
import java.nio.file.Path;
import java.nio.file.Paths;
public class Example {
    private static final Path IDLIVEFACE_HOME = Paths.get("/opt/idliveface");

    private final ImageDecoder decoder;
    private final FaceAnalyzer analyzer;

    public Example() {
        try (Blueprint blueprint = new Blueprint(IDLIVEFACE_HOME.resolve("data"))) {
            System.out.println("Initializing...");
            decoder = blueprint.createImageDecoder();
            analyzer = blueprint.createFaceAnalyzer();
        }
    }

    public void run() {
        System.out.println("\nAnalyzing real_face.jpg");
        try (Image image =
                decoder.decodeFile(IDLIVEFACE_HOME.resolve("examples/images/real_face.jpg"))) {
            FaceAnalysisResult result = analyzer.analyze(image);
            processResult(result);
        }

        System.out.println("\nAnalyzing spoof_face.jpg");
        try (Image image =
                decoder.decodeFile(IDLIVEFACE_HOME.resolve("examples/images/spoof_face.jpg"))) {
            FaceAnalysisResult result = analyzer.analyze(image);
            processResult(result);
        }

        System.out.println("\nAnalyzing face_is_occluded.jpg");
        try (Image image =
                decoder.decodeFile(IDLIVEFACE_HOME.resolve("examples/images/face_is_occluded.jpg"))) {
            FaceAnalysisResult result = analyzer.analyze(image);
            processResult(result);
        }
    }

    private static void processResult(FaceAnalysisResult result) {
        System.out.println("Result: " + result);
        if (result.getStatus() != FaceStatus.INVALID) {
            System.out.println(
                    "Image is " + result.getStatus() + ", probability is " + result.getGenuineProbability().get());
        } else {
            System.out.println("Image did not pass validations and is not suitable for analysis");
            System.out.println("Failed validations: " + result.getFailedValidations());
        }
    }

    public static void main(String[] args) {
        Example example = new Example();
        example.run();
    }
}
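To compile and run the example on Linux (using the same install path and flags assumed earlier in this guide; note the current directory added to the classpath for the compiled class):

```shell
javac -cp "/opt/idliveface/libs/java/idliveface.jar" Example.java
java -cp "/opt/idliveface/libs/java/idliveface.jar:." \
     -Djava.library.path="/opt/idliveface/libs" Example
```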
Using on Android¶
To bootstrap the SDK use the AndroidSupport utility:
import net.idrnd.idliveface.android.AndroidSupport;
import android.content.Context;
Context context = ...;
Blueprint blueprint = AndroidSupport.createBlueprint(context);
This will extract the data files and create a blueprint that uses them. The files are only extracted on the first run of the application, and are then cached in the filesystem.