To get the sraix functionality working again I was playing with the API of the metasearch engine searx. I made a Maven project in Eclipse, managed to pull in the right HTTP libraries for the GET request, and got an OK from the test link that comes with the documentation of org.apache.http.HttpEntity. But when I try to reach the searx API I get a javax.net.ssl.SSLHandshakeException.
So I tried the com.squareup.okhttp library and got the same exception at the same point with the following code:
----------------------------------------------------------------------------------------
import java.io.IOException;
import com.squareup.okhttp.OkHttpClient;
import com.squareup.okhttp.Request;
import com.squareup.okhttp.Response;
public class PostMan {

    OkHttpClient client = new OkHttpClient();

    // issue a GET request and return the response body as a string
    String doGetRequest(String url) throws IOException {
        Request request = new Request.Builder()
                .url(url) // use the url parameter instead of a hard-coded address
                .get()
                //.addHeader("Cache-Control", "no-cache")
                //.addHeader("Postman-Token", "4662b15f-6a54-4222-4b92-471245ad6c92")
                .build();

        Response response = client.newCall(request).execute();
        return response.body().string();
    }

    public static void main(String[] args) throws IOException {
        // issue the GET request
        PostMan example = new PostMan();
        String getResponse = example.doGetRequest("https://searx.laquadrature.net/?q=cat&format=json");
        System.out.println(getResponse);
    }
}
import requests
import random

url = "https://searx.laquadrature.net/"

# Get information from a searx-search content box
def getInfo(query):
    querystring = {"q": query, "format": "json"}
    headers = {
        'Cache-Control': "no-cache"
    }
    response = requests.request("GET", url, headers=headers, params=querystring).json()
    infoboxes = response["infoboxes"][0]
    finalResponse = infoboxes["content"]
    return finalResponse

# Get images from a searx-search
def getImage(query):
    querystring = {"q": query, "format": "json", "categories": "images"}
    headers = {
        'Cache-Control': "no-cache"
    }
    response = requests.request("GET", url, headers=headers, params=querystring).json()
    results = response["results"]
    numberOfResults = len(results)
    if numberOfResults > 20:
        numberOfResults = 20
    # randint is inclusive on both ends, so cap at numberOfResults - 1
    myRandom = random.randint(0, numberOfResults - 1)
    result = results[myRandom]
    finalResponse = result["img_src"]
    if finalResponse.startswith("//"):
        finalResponse = finalResponse[2:]
    return finalResponse

print(getImage("boat"))
----------------------------------------------------------------------------------------
It returns random links to pictures of boats when executed! The info feature only works for some of my tests, and if the result is "none" it should be retried by another service like WikiData.
So the main question is: how do I get rid of the SSL exception, so I can write this in Java and possibly wrap it in a searx service afterwards, to use it together with OpenWeather, WikiData, WolframAlpha etc. for the sraix commands?
That would be nice!
It would give an alternative to ask.pannous in ProgramAB.
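The retry-on-empty behaviour mentioned above could be sketched like this. This is only a sketch of the fallback pattern, assuming each service is wrapped as a supplier of an optional answer — the two services here are stand-in stubs, not real MRL service calls:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

public class FallbackChain {

    // Try each service in order and return the first non-empty answer.
    static Optional<String> firstAnswer(List<Supplier<Optional<String>>> services) {
        for (Supplier<Optional<String>> service : services) {
            Optional<String> answer = service.get();
            if (answer.isPresent()) {
                return answer;
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        // Stub services: searx finds nothing, WikiData answers.
        Supplier<Optional<String>> searx = Optional::empty;
        Supplier<Optional<String>> wikiData = () -> Optional.of("A boat is a watercraft.");

        System.out.println(firstAnswer(Arrays.asList(searx, wikiData)).orElse("no answer"));
        // prints "A boat is a watercraft."
    }
}
```

A real searx wrapper would return `Optional.empty()` when the JSON has no infobox, letting the chain fall through to the next service.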
Hi Kakadu,
I'm guessing the site has a "new" root or issuing CA which Java 8 does not have.
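You can check that guess with the plain JDK — this sketch lists the root CAs the running Java installation actually trusts, so you can see whether the searx host's issuing CA is among them:

```java
import javax.net.ssl.TrustManager;
import javax.net.ssl.TrustManagerFactory;
import javax.net.ssl.X509TrustManager;
import java.security.KeyStore;
import java.security.cert.X509Certificate;

public class ListTrustedRoots {
    public static void main(String[] args) throws Exception {
        // Passing a null KeyStore loads the JDK's default truststore (lib/security/cacerts).
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init((KeyStore) null);

        for (TrustManager tm : tmf.getTrustManagers()) {
            if (tm instanceof X509TrustManager) {
                X509Certificate[] roots = ((X509TrustManager) tm).getAcceptedIssuers();
                System.out.println(roots.length + " trusted root certificates");
                for (X509Certificate root : roots) {
                    // look for the CA that issued the searx host's certificate
                    System.out.println(root.getSubjectX500Principal().getName());
                }
            }
        }
    }
}
```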
There are a couple of options possible:
Programmatically installing the cert,
or configuring HttpClient to accept 'any' cert.
I prefer the second, but this option is not currently exposed. I don't think it would take much to add.
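A minimal stdlib sketch of the second option — INSECURE, since it disables all certificate validation, so only for testing. It installs a permissive SSLContext as the JVM-wide default for HttpsURLConnection; the same SSLSocketFactory could presumably be handed to other clients that accept one:

```java
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.security.SecureRandom;
import java.security.cert.X509Certificate;

public class TrustAnyCert {

    public static void main(String[] args) throws Exception {
        // A trust manager that accepts every certificate chain (INSECURE).
        TrustManager[] trustAll = new TrustManager[] {
            new X509TrustManager() {
                public void checkClientTrusted(X509Certificate[] chain, String authType) {}
                public void checkServerTrusted(X509Certificate[] chain, String authType) {}
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
            }
        };

        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, trustAll, new SecureRandom());

        // Every HttpsURLConnection opened from now on uses the permissive context.
        HttpsURLConnection.setDefaultSSLSocketFactory(sslContext.getSocketFactory());
        // Also skip hostname verification (again: testing only).
        HttpsURLConnection.setDefaultHostnameVerifier((hostname, session) -> true);

        System.out.println("permissive SSL context installed");
    }
}
```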
This is the code I tried to use with the Apache HttpClient; it tries a GET and a POST request:
package searxTest;
import java.util.ArrayList;
import java.util.List;
import org.apache.http.HttpEntity;
import org.apache.http.NameValuePair;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;
public class Main {

    public static void main(String[] args) throws Exception {
        CloseableHttpClient httpclient = HttpClients.createDefault();
        try {
            HttpGet httpGet = new HttpGet("https://searx.laquadrature.net/?q=cat&format=json");
            CloseableHttpResponse response1 = httpclient.execute(httpGet);
            // The underlying HTTP connection is still held by the response object
            // to allow the response content to be streamed directly from the network socket.
            // In order to ensure correct deallocation of system resources
            // the user MUST call CloseableHttpResponse#close() from a finally clause.
            // Please note that if response content is not fully consumed the underlying
            // connection cannot be safely re-used and will be shut down and discarded
            // by the connection manager.
            try {
                System.out.println(response1.getStatusLine());
                HttpEntity entity1 = response1.getEntity();
                // do something useful with the response body
                // and ensure it is fully consumed
                EntityUtils.consume(entity1);
            } finally {
                response1.close();
            }

            HttpPost httpPost = new HttpPost("https://searx.laquadrature.net/?q=cat&format=json");
            List<NameValuePair> nvps = new ArrayList<NameValuePair>();
            nvps.add(new BasicNameValuePair("categories", "images"));
            httpPost.setEntity(new UrlEncodedFormEntity(nvps));
            CloseableHttpResponse response2 = httpclient.execute(httpPost);
            try {
                System.out.println(response2.getStatusLine());
                HttpEntity entity2 = response2.getEntity();
                // do something useful with the response body
                // and ensure it is fully consumed
                EntityUtils.consume(entity2);
            } finally {
                response2.close();
            }
        } finally {
            httpclient.close();
        }
    }
}
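The other option mentioned above — programmatically trusting the site's certificate instead of trusting everything — could look roughly like this with the plain JDK. The main method demonstrates it with an empty keystore; the commented-out block shows where a downloaded PEM file (filename is a placeholder) would be added. The resulting SSLContext could then, depending on the HttpClient version, be passed in when building the client instead of using HttpClients.createDefault():

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.security.KeyStore;

public class CustomTrust {

    // Build an SSLContext that trusts exactly the certificates in the given keystore.
    static SSLContext contextTrusting(KeyStore keyStore) throws Exception {
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(keyStore);
        SSLContext context = SSLContext.getInstance("TLS");
        context.init(null, tmf.getTrustManagers(), null);
        return context;
    }

    public static void main(String[] args) throws Exception {
        KeyStore keyStore = KeyStore.getInstance(KeyStore.getDefaultType());
        keyStore.load(null, null); // start with an empty, in-memory keystore

        // With the site's certificate saved to a PEM file, it would be added like this:
        // try (java.io.InputStream pem = new java.io.FileInputStream("searx.pem")) {
        //     keyStore.setCertificateEntry("searx",
        //             java.security.cert.CertificateFactory.getInstance("X.509")
        //                     .generateCertificate(pem));
        // }

        SSLContext context = contextTrusting(keyStore);
        System.out.println("custom SSLContext built: " + context.getProtocol());
        // prints "custom SSLContext built: TLS"
    }
}
```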