Class LangChain4j

java.lang.Object
com.google.adk.models.BaseLlm
com.google.adk.models.langchain4j.LangChain4j

public abstract class LangChain4j extends BaseLlm

  • Constructor Details

    • LangChain4j

      public LangChain4j(dev.langchain4j.model.chat.ChatModel chatModel)
    • LangChain4j

      public LangChain4j(dev.langchain4j.model.chat.ChatModel chatModel, String modelName)
    • LangChain4j

      public LangChain4j(dev.langchain4j.model.chat.StreamingChatModel streamingChatModel)
    • LangChain4j

      public LangChain4j(dev.langchain4j.model.chat.StreamingChatModel streamingChatModel, String modelName)
    • LangChain4j

      public LangChain4j(dev.langchain4j.model.chat.ChatModel chatModel, dev.langchain4j.model.chat.StreamingChatModel streamingChatModel, String modelName)
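The constructors above wrap an existing LangChain4j model for use as an ADK `BaseLlm`. The sketch below constructs one from a concrete `ChatModel`, following the direct-construction style shown in the ADK documentation (the class is declared abstract here, suggesting an AutoValue-style arrangement, but published usage examples instantiate it directly). `OpenAiChatModel` and the `OPENAI_API_KEY` environment variable are illustrative assumptions; any `dev.langchain4j.model.chat.ChatModel` implementation works the same way.

```java
import com.google.adk.models.langchain4j.LangChain4j;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class WrapChatModelExample {
    public static void main(String[] args) {
        // Build any LangChain4j ChatModel (assumes the langchain4j-open-ai
        // artifact is on the classpath).
        OpenAiChatModel chatModel = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        // Wrap it for ADK. The second argument sets the name that model()
        // reports; the single-argument constructor derives it from the model.
        LangChain4j llm = new LangChain4j(chatModel, "gpt-4o-mini");
        System.out.println(llm.model());
    }
}
```

Use the `StreamingChatModel` constructors instead (or the three-argument constructor with both) when you want `generateContent(..., true)` to stream partial responses.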
  • Method Details

    • chatModel

      public abstract @Nullable dev.langchain4j.model.chat.ChatModel chatModel()
    • streamingChatModel

      public abstract @Nullable dev.langchain4j.model.chat.StreamingChatModel streamingChatModel()
    • objectMapper

      public abstract com.fasterxml.jackson.databind.ObjectMapper objectMapper()
    • modelName

      public abstract String modelName()
    • tokenCountEstimator

      public abstract @Nullable dev.langchain4j.model.TokenCountEstimator tokenCountEstimator()
    • model

      public String model()
      Description copied from class: BaseLlm
      Returns the name of the LLM model.
      Overrides:
      model in class BaseLlm
      Returns:
      The name of the LLM model.
    • builder

      public static LangChain4j.Builder builder()
    • generateContent

      public io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)
      Description copied from class: BaseLlm
      Generates one content from the given LLM request and tools.
      Specified by:
      generateContent in class BaseLlm
      Parameters:
      llmRequest - The LLM request containing the input prompt and parameters.
      stream - A boolean flag indicating whether to stream the response.
      Returns:
      A Flowable of LlmResponses. For non-streaming calls, it will only yield one LlmResponse. For streaming calls, it may yield more than one LlmResponse, but all yielded LlmResponses should be treated as one content by merging their parts.
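A minimal non-streaming call might look like the sketch below. It assumes `LlmRequest` exposes an AutoValue-style builder with a `contents(...)` setter taking `com.google.genai.types.Content` values, and that `LlmResponse.content()` returns an `Optional<Content>`; verify these against your ADK version. The OpenAI model and API-key environment variable are illustrative placeholders.

```java
import com.google.adk.models.LlmRequest;
import com.google.adk.models.LlmResponse;
import com.google.adk.models.langchain4j.LangChain4j;
import com.google.genai.types.Content;
import com.google.genai.types.Part;
import dev.langchain4j.model.openai.OpenAiChatModel;
import io.reactivex.rxjava3.core.Flowable;
import java.util.List;

public class GenerateContentExample {
    public static void main(String[] args) {
        LangChain4j llm = new LangChain4j(
                OpenAiChatModel.builder()
                        .apiKey(System.getenv("OPENAI_API_KEY"))
                        .modelName("gpt-4o-mini")
                        .build());

        // A request with a single user turn.
        LlmRequest request = LlmRequest.builder()
                .contents(List.of(
                        Content.fromParts(Part.fromText("Say hello in one word."))))
                .build();

        // stream=false: the Flowable yields exactly one LlmResponse.
        // With stream=true (and a StreamingChatModel configured), merge the
        // parts of all emitted responses into one content.
        Flowable<LlmResponse> responses = llm.generateContent(request, false);
        responses.blockingForEach(response ->
                response.content().ifPresent(System.out::println));
    }
}
```

Note that `generateContent` returns a cold `Flowable`: no request is sent until it is subscribed to (here via `blockingForEach`).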
    • connect

      public BaseLlmConnection connect(LlmRequest llmRequest)
      Description copied from class: BaseLlm
      Creates a live connection to the LLM.
      Specified by:
      connect in class BaseLlm