In Java

To get the sraix functionality working again I was playing with the API of the metasearch engine searx. I made a Maven project in Eclipse, managed to get the right HTTP libraries to do the GET request, and got an OK from the test link that comes with the documentation of org.apache.http.HttpEntity. When I try to access the searx API itself, I get an SSL exception.

So I tried the com.squareup.okhttp library and got the same exception at the same point with the following code:


    import java.io.IOException;

    import com.squareup.okhttp.OkHttpClient;
    import com.squareup.okhttp.Request;
    import com.squareup.okhttp.Response;

    public class PostMan {

        OkHttpClient client = new OkHttpClient();

        String doGetRequest(String url) throws IOException {
            Request request = new Request.Builder()
                    .url(url)
                    //.addHeader("Cache-Control", "no-cache")
                    //.addHeader("Postman-Token", "4662b15f-6a54-4222-4b92-471245ad6c92")
                    .build();

            Response response = client.newCall(request).execute();
            return response.body().string();
        }

        public static void main(String[] args) throws IOException {
            // issue the GET request
            PostMan example = new PostMan();
            String getResponse = example.doGetRequest("");
            System.out.println(getResponse);
        }
    }




The only fix I found on the internet was to add the certificate manually to the JVM. But this is not really useful, as I want to use it in myrobotlab and don't want to tell everybody to add certificates...
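For completeness, here is a minimal JDK-only sketch of the other common workaround: building an SSLContext whose trust manager accepts every certificate. This avoids touching the JVM's keystore, but it disables certificate validation entirely, so it is only suitable for testing. The class name TrustAllDemo is mine, not from any library.

    import java.security.cert.X509Certificate;
    import javax.net.ssl.HttpsURLConnection;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManager;
    import javax.net.ssl.X509TrustManager;

    public class TrustAllDemo {

        // Build an SSLContext whose trust manager accepts every certificate chain.
        // WARNING: this disables certificate validation entirely; testing only.
        static SSLContext trustAllContext() throws Exception {
            TrustManager[] trustAll = new TrustManager[] {
                new X509TrustManager() {
                    public void checkClientTrusted(X509Certificate[] chain, String authType) {}
                    public void checkServerTrusted(X509Certificate[] chain, String authType) {}
                    public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                }
            };
            SSLContext context = SSLContext.getInstance("TLS");
            context.init(null, trustAll, new java.security.SecureRandom());
            return context;
        }

        public static void main(String[] args) throws Exception {
            SSLContext context = trustAllContext();
            // Route all HttpsURLConnection traffic through the permissive context.
            HttpsURLConnection.setDefaultSSLSocketFactory(context.getSocketFactory());
            System.out.println(context.getProtocol());
        }
    }

If I understand the OkHttp 2.x API correctly, the same context can be plugged into the client above via client.setSslSocketFactory(context.getSocketFactory()) together with a permissive HostnameVerifier.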
For test purposes I started writing some code in Python, where I installed the requests library and didn't get the error. That's the script:

    import requests
    import random

    url = ""

    # Get information from a searx-search content-box
    def getInfo(query):
        querystring = {"q": query, "format": "json"}
        headers = {
            'Cache-Control': "no-cache"
        }
        response = requests.request("GET", url, headers=headers, params=querystring).json()
        infoboxes = response["infoboxes"][0]
        finalResponse = infoboxes["content"]
        return finalResponse

    # Get images from a searx-search
    def getImage(query):
        querystring = {"q": query, "format": "json", "categories": "images"}
        headers = {
            'Cache-Control': "no-cache"
        }
        response = requests.request("GET", url, headers=headers, params=querystring).json()
        results = response["results"]
        numberOfResults = len(results)
        if numberOfResults > 20:
            numberOfResults = 20
        # randint is inclusive on both ends, so subtract 1 to stay in range
        myRandom = random.randint(0, numberOfResults - 1)
        result = results[myRandom]
        finalResponse = result["img_src"]
        if finalResponse.startswith("//"):
            finalResponse = finalResponse[2:]
        return finalResponse

    print(getImage("boat"))



It prints random links to pictures of boats when executed! The info feature only works for some of my tests; if the result is "none" it should be retried by another service like Wikidata.



So the main question is: how do I get rid of the SSL exception, so I can write this in Java and possibly wrap it in a searx service afterwards, to use together with OpenWeather, WikiData, WolframAlpha etc. for the sraix commands?


hairygael:

That would be nice!

It would give an alternative to ask.pannous in ProgramAB.

GroG:

Hi Kakadu,

I'm guessing the site has a "new" root or issuing CA which Java 8 does not have.

There are a couple of possible options:

Programmatically installing the cert 

InstallCert.main(new String[]{""})

or configure HttpClient to accept 'any' cert.

I prefer the second, but at the moment this option is not exposed; I don't think it would take much to add.
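A minimal sketch of that second option with Apache HttpClient 4.5, assuming httpclient is on the classpath (the lambda trust strategy accepts every certificate chain, so this is for testing only; the class and method names here are mine):

    import javax.net.ssl.SSLContext;

    import org.apache.http.conn.ssl.NoopHostnameVerifier;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.ssl.SSLContextBuilder;

    public class TrustAllHttpClient {

        // Build a CloseableHttpClient that accepts any certificate chain.
        // WARNING: disables certificate and hostname verification; testing only.
        static CloseableHttpClient unsafeClient() throws Exception {
            SSLContext sslContext = SSLContextBuilder.create()
                    // Trust strategy that accepts every chain, regardless of issuer.
                    .loadTrustMaterial(null, (chain, authType) -> true)
                    .build();
            return HttpClients.custom()
                    .setSSLContext(sslContext)
                    .setSSLHostnameVerifier(NoopHostnameVerifier.INSTANCE)
                    .build();
        }
    }

A client built this way can replace the HttpClients.createDefault() call in the code below.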

Kakadu31:

This is the code I tried to use with the Apache HttpClient; it tries a GET and a POST request:

    package searxTest;

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.http.HttpEntity;
    import org.apache.http.NameValuePair;
    import org.apache.http.client.entity.UrlEncodedFormEntity;
    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.client.methods.HttpPost;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;
    import org.apache.http.message.BasicNameValuePair;
    import org.apache.http.util.EntityUtils;

    public class Main {

        public static void main(String[] args) throws Exception {
            CloseableHttpClient httpclient = HttpClients.createDefault();
            try {
                HttpGet httpGet = new HttpGet("");
                CloseableHttpResponse response1 = httpclient.execute(httpGet);
                // The underlying HTTP connection is still held by the response object
                // to allow the response content to be streamed directly from the network socket.
                // In order to ensure correct deallocation of system resources
                // the user MUST call CloseableHttpResponse#close() from a finally clause.
                // Please note that if response content is not fully consumed the underlying
                // connection cannot be safely re-used and will be shut down and discarded
                // by the connection manager.
                try {
                    HttpEntity entity1 = response1.getEntity();
                    // do something useful with the response body
                    // and ensure it is fully consumed
                    EntityUtils.consume(entity1);
                } finally {
                    response1.close();
                }

                HttpPost httpPost = new HttpPost("");
                List<NameValuePair> nvps = new ArrayList<NameValuePair>();
                nvps.add(new BasicNameValuePair("categories", "images"));
                httpPost.setEntity(new UrlEncodedFormEntity(nvps));
                CloseableHttpResponse response2 = httpclient.execute(httpPost);

                try {
                    HttpEntity entity2 = response2.getEntity();
                    // do something useful with the response body
                    // and ensure it is fully consumed
                    EntityUtils.consume(entity2);
                } finally {
                    response2.close();
                }
            } finally {
                httpclient.close();
            }
        }
    }