Simple AEM Integration with OpenAI For Sentiment Analysis Of Given Text

Kinjal P Darji
4 min read · Mar 21, 2024


Here I’ll demonstrate how AEM can be integrated with OpenAI, using an AEM servlet and services, to perform sentiment analysis of a text input. In this example we’ll read the text from a request parameter, but the approach can be extended to use a text component in AEM as well. Below are the Java classes that need to be created for the integration.

  1. OpenAISentimentAnalysisServlet
  2. OpenAIIntegrationService
  3. OpenAIIntegrationServiceImpl
  4. OpenAIIntegrationServiceConfiguration
  5. HttpClientService
  6. HttpClientServiceImpl
  7. HttpClientServiceConfiguration
  8. OSGI configuration file — com.aemtechblog.core.services.impl.OpenAIIntegrationServiceImpl.cfg.json

OpenAISentimentAnalysisServlet

In this servlet we’ll call OpenAIIntegrationService, which in turn makes an HTTP request to OpenAI and returns the sentiment analysis in one word. Below is the code snippet.



import java.io.IOException;

import javax.servlet.Servlet;
import javax.servlet.ServletException;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.component.propertytypes.ServiceDescription;

import com.aemtechblog.core.services.OpenAIIntegrationService;

@Component(service = { Servlet.class }, property = { "sling.servlet.paths=/bin/sentimentanalysis",
        "sling.servlet.methods=GET", "sling.servlet.extensions=txt" })
@ServiceDescription("Simple Open AI Sentiment Analysis Servlet")
public class OpenAISentimentAnalysisServlet extends SlingSafeMethodsServlet {

    private static final long serialVersionUID = 1L;

    @Reference
    OpenAIIntegrationService openAIIntegrationService;

    @Override
    protected void doGet(final SlingHttpServletRequest request, final SlingHttpServletResponse response)
            throws ServletException, IOException {
        String text = request.getParameter("text");

        String analysis = openAIIntegrationService.getSentimentAnlysis(text);
        response.getWriter().write(analysis);
    }
}
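Once deployed, the servlet can be tested by requesting /bin/sentimentanalysis.txt?text=&lt;encoded text&gt; on the AEM instance (the path and extension are registered above). The text parameter must be URL-encoded; a small sketch of building such a request URL, assuming a local author instance on the default port 4502:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class SentimentRequestUrl {

    // Builds the servlet URL for a given input text. The host and port are
    // placeholders for your local AEM author instance.
    static String buildUrl(String text) {
        String encoded = URLEncoder.encode(text, StandardCharsets.UTF_8);
        return "http://localhost:4502/bin/sentimentanalysis.txt?text=" + encoded;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("this is the text"));
        // http://localhost:4502/bin/sentimentanalysis.txt?text=this+is+the+text
    }
}
```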

OpenAIIntegrationService is a simple interface with the declaration of the method getSentimentAnlysis(String inputString).

import java.io.IOException;

import org.apache.http.client.ClientProtocolException;

public interface OpenAIIntegrationService {

    String getSentimentAnlysis(String inputString) throws ClientProtocolException, IOException;
}

OpenAIIntegrationServiceImpl implements the method getSentimentAnlysis. This class uses the configuration to build an HttpPost request with an OpenAI-compatible JSON string as the payload, and uses HttpClientService to send the request to the OpenAI endpoint. Proper error handling can be added to improve the code below. :)

import java.io.IOException;

import org.apache.http.HttpEntity;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.util.EntityUtils;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.ConfigurationPolicy;
import org.osgi.service.component.annotations.Deactivate;
import org.osgi.service.component.annotations.Modified;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.metatype.annotations.Designate;

import com.aemtechblog.core.services.HttpClientService;
import com.aemtechblog.core.services.OpenAIIntegrationService;
import com.aemtechblog.core.services.configurations.OpenAIIntegrationServiceConfiguration;
import com.google.gson.JsonArray;
import com.google.gson.JsonObject;

@Component(service = OpenAIIntegrationService.class, configurationPolicy = ConfigurationPolicy.OPTIONAL, immediate = true)
@Designate(ocd = OpenAIIntegrationServiceConfiguration.class)
public class OpenAIIntegrationServiceImpl implements OpenAIIntegrationService {

    private OpenAIIntegrationServiceConfiguration configuration;

    @Reference
    HttpClientService httpClientService;

    @Activate
    public void activate(final OpenAIIntegrationServiceConfiguration configuration) {
        this.configuration = configuration;
    }

    @Modified
    public void modified(final OpenAIIntegrationServiceConfiguration configuration) {
        this.configuration = configuration;
    }

    @Deactivate
    public void deactivate(final OpenAIIntegrationServiceConfiguration configuration) {
        this.configuration = configuration;
    }

    @Override
    public String getSentimentAnlysis(String inputString) throws ClientProtocolException, IOException {
        return requestSentimentAnalysis(inputString);
    }

    private String requestSentimentAnalysis(String inputString) throws ClientProtocolException, IOException {
        HttpPost httpPost = new HttpPost(this.configuration.llm_endpoint());
        httpPost.addHeader("Content-Type", this.configuration.llm_content_type());
        httpPost.addHeader("Authorization", this.configuration.llm_authorization());
        String requestBody = getRequestBody(inputString);

        // Note: StringEntity defaults to ISO-8859-1; pass StandardCharsets.UTF_8
        // here if the input may contain non-ASCII characters.
        httpPost.setEntity(new StringEntity(requestBody));

        CloseableHttpClient closeableHttpClient = httpClientService.getHttpClient();

        try (CloseableHttpResponse closeableHttpResponse = closeableHttpClient.execute(httpPost)) {
            HttpEntity entity = closeableHttpResponse.getEntity();
            if (entity != null) {
                return EntityUtils.toString(entity);
            }
        }

        return "";
    }

    // Builds the chat-completions payload; max_tokens is set to 1 because the
    // expected answer is a single word (Positive, Negative or Neutral).
    private String getRequestBody(String inputString) {
        JsonObject requestBody = new JsonObject();
        requestBody.addProperty("model", this.configuration.llm_model());
        requestBody.add("messages", getMessages(inputString));
        requestBody.addProperty("max_tokens", 1);
        return requestBody.toString();
    }

    // Wraps the prompt and the quoted input text in a single user message.
    private JsonArray getMessages(String inputString) {
        JsonObject message = new JsonObject();
        message.addProperty("role", "user");
        message.addProperty("content", this.configuration.prompt() + "\"" + inputString + "\"");
        JsonArray messages = new JsonArray();
        messages.add(message);
        return messages;
    }
}

OpenAIIntegrationServiceConfiguration

This fetches the required configurations, such as the authorization token, model, message prompt and API endpoint, from the OSGi configuration. Here the authorization value is a string like “Bearer <auth token from OpenAI keys>”.



import org.osgi.service.metatype.annotations.AttributeDefinition;
import org.osgi.service.metatype.annotations.AttributeType;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

@ObjectClassDefinition(
        name = "Open AI Integration Service",
        description = "Provides Sentiment Analysis of the input text sent from AEM"
)
public @interface OpenAIIntegrationServiceConfiguration {

    @AttributeDefinition(
            name = "Enabled",
            description = "Enable the service's functionalities.",
            type = AttributeType.BOOLEAN)
    boolean enabled() default true;

    @AttributeDefinition(
            name = "API Endpoint",
            description = "Defines to which API endpoint requests are sent.",
            type = AttributeType.STRING)
    String llm_endpoint() default "https://api.openai.com/v1/chat/completions";

    @AttributeDefinition(
            name = "Model",
            description = "Defines the model.",
            type = AttributeType.STRING)
    String llm_model() default "gpt-3.5-turbo";

    @AttributeDefinition(
            name = "Content Type",
            description = "Defines the POST request's Content-Type header.",
            type = AttributeType.STRING)
    String llm_content_type() default "application/json";

    @AttributeDefinition(
            name = "Authorization",
            description = "Defines the POST request's Authorization header.",
            type = AttributeType.STRING)
    String llm_authorization() default "";

    @AttributeDefinition(
            name = "Prompt",
            description = "Provide the prompt text.",
            type = AttributeType.STRING)
    String prompt() default "Perform sentiment analysis of this text in one word from Positive, Negative, Neutral ";
}

Also define HttpClientService, HttpClientServiceImpl and HttpClientServiceConfiguration, which send the request to OpenAI once the HttpPost request is created.
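The article does not show these three types, so here is a minimal sketch of how they might look. Everything beyond the names HttpClientService, HttpClientServiceImpl and HttpClientServiceConfiguration (the timeout properties, their defaults, the lazy client construction) is my assumption, not the author’s code; in a real project each type would be a public class in its own file.

```java
import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.metatype.annotations.AttributeDefinition;
import org.osgi.service.metatype.annotations.AttributeType;
import org.osgi.service.metatype.annotations.Designate;
import org.osgi.service.metatype.annotations.ObjectClassDefinition;

// Interface consumed by OpenAIIntegrationServiceImpl.
interface HttpClientService {
    CloseableHttpClient getHttpClient();
}

// Hypothetical configuration: timeout values in milliseconds.
@ObjectClassDefinition(name = "Http Client Service")
@interface HttpClientServiceConfiguration {

    @AttributeDefinition(name = "Connect Timeout", type = AttributeType.INTEGER)
    int connect_timeout() default 5000;

    @AttributeDefinition(name = "Socket Timeout", type = AttributeType.INTEGER)
    int socket_timeout() default 10000;
}

@Component(service = HttpClientService.class, immediate = true)
@Designate(ocd = HttpClientServiceConfiguration.class)
class HttpClientServiceImpl implements HttpClientService {

    // Defaults used when no OSGi configuration has been applied.
    private int connectTimeout = 5000;
    private int socketTimeout = 10000;

    @Activate
    protected void activate(final HttpClientServiceConfiguration configuration) {
        this.connectTimeout = configuration.connect_timeout();
        this.socketTimeout = configuration.socket_timeout();
    }

    @Override
    public CloseableHttpClient getHttpClient() {
        // Build a client with explicit timeouts so a slow OpenAI call
        // cannot block an AEM request thread indefinitely.
        RequestConfig requestConfig = RequestConfig.custom()
                .setConnectTimeout(connectTimeout)
                .setSocketTimeout(socketTimeout)
                .build();
        return HttpClients.custom()
                .setDefaultRequestConfig(requestConfig)
                .build();
    }
}
```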

Below is the configuration file com.aemtechblog.core.services.impl.OpenAIIntegrationServiceImpl.cfg.json, which can be placed in ui.config/../apps/../osgiconfig/config.author. Note that a single underscore in a configuration method name (e.g. llm_endpoint()) maps to a dot in the corresponding property key (llm.endpoint).

{
  "enabled": true,
  "llm.endpoint": "https://api.openai.com/v1/chat/completions",
  "llm.model": "gpt-3.5-turbo",
  "llm.content.type": "application/json",
  "llm.authorization": "Bearer <<>>",
  "prompt": "Perform sentiment analysis of this text in one word from Positive, Negative, Neutral "
}

Below is the sample request in JSON format which is sent to OpenAI.

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "Perform sentiment analysis of this text in one word from Positive, Negative, Neutral \"this is the text\""
    }
  ],
  "max_tokens": 1
}

Below is the sample response sent by OpenAI.

{
  ...
  "model": "gpt-3.5-turbo-0125",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Neutral"
      },
      "logprobs": null,
      "finish_reason": "length"
    }
  ],
  "usage": {
    "prompt_tokens": 28,
    "completion_tokens": 1,
    "total_tokens": 29
  },
  ...
}
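The service above returns the raw response body to the servlet; in practice you usually want only choices[0].message.content (the one-word sentiment). A JSON parser such as the Gson API already used in the bundle is the robust way to do this; purely as an illustration, here is a stdlib-only sketch of pulling out the content field:

```java
public class SentimentResponseParser {

    // Naively extracts the first "content" value from the raw response body.
    // Illustrative only: it ignores escaping and nested structure, so a real
    // implementation should use a JSON parser such as Gson instead.
    static String extractContent(String body) {
        int key = body.indexOf("\"content\"");
        if (key < 0) {
            return "";
        }
        int colon = body.indexOf(':', key);
        int open = body.indexOf('"', colon + 1);
        int close = body.indexOf('"', open + 1);
        if (colon < 0 || open < 0 || close < 0) {
            return "";
        }
        return body.substring(open + 1, close);
    }

    public static void main(String[] args) {
        String sample = "{\"choices\":[{\"message\":{\"role\":\"assistant\",\"content\":\"Neutral\"}}]}";
        System.out.println(extractContent(sample)); // prints: Neutral
    }
}
```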

Reference:

https://medium.com/@jlanssie/translate-entires-websites-in-aem-automatically-with-openai-944875cbfa22

Written by Kinjal P Darji

Hi, I am an AEM architect and a certified AWS Developer — Associate.