class KafkaConsumer extends KafkaConsumerInterface with LazyLogging

A Kafka client that uses the Pekko Connectors Kafka Consumer under the hood to create a stream of events from a Kafka cluster. To configure the Pekko Connectors Kafka Consumer, use the standard Typesafe configuration, i.e. pekko.kafka.consumer (note that the key and value deserializers are hardcoded to byte arrays, so you cannot override them).
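Since the consumer is configured through the standard Typesafe configuration under pekko.kafka.consumer, overrides can be placed in application.conf. The setting names below are the standard Pekko Connectors Kafka consumer keys; the values are purely illustrative.

```hocon
# Sketch of overriding Pekko Connectors Kafka consumer defaults.
pekko.kafka.consumer {
  # How often to poll the Kafka broker when no messages are available
  poll-interval = 50ms
  # How long to wait for the stream to drain on shutdown
  stop-timeout = 30s
  # Properties passed straight through to the underlying Kafka client
  kafka-clients {
    bootstrap.servers = "localhost:9092"
    group.id = "backup-consumer"
  }
}
```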
By Inheritance
- KafkaConsumer
- LazyLogging
- KafkaConsumerInterface
- AnyRef
- Any
Instance Constructors

- new KafkaConsumer(configureConsumer: Option[(ConsumerSettings[Array[Byte], Array[Byte]]) => ConsumerSettings[Array[Byte], Array[Byte]]] = None, configureCommitter: Option[(CommitterSettings) => CommitterSettings] = None)(implicit system: ActorSystem, kafkaClusterConfig: KafkaCluster, backupConfig: Backup)
  - configureConsumer: A way to configure the underlying Kafka consumer settings.
  - configureCommitter: A way to configure the underlying Kafka committer settings.
  - system: A classic ActorSystem.
  - kafkaClusterConfig: Additional cluster configuration that is needed.
 
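The configureConsumer and configureCommitter parameters are optional functions applied on top of the defaults loaded from configuration. The sketch below illustrates that Option-of-function pattern with a hypothetical stand-in settings type (StubConsumerSettings is not part of the library; the real constructor takes pekko's ConsumerSettings and CommitterSettings in the same shape).

```scala
// Hypothetical stand-in for pekko's ConsumerSettings, used only to
// illustrate the Option[S => S] configuration hook.
final case class StubConsumerSettings(groupId: String, bootstrapServers: String)

// Mirrors how an optional configure function is applied on top of the
// defaults loaded from pekko.kafka.consumer.
def applyConfigure[S](defaults: S, configure: Option[S => S]): S =
  configure.fold(defaults)(f => f(defaults))

val defaults = StubConsumerSettings("default-group", "localhost:9092")

// Equivalent in spirit to: new KafkaConsumer(configureConsumer = Some(...))
val configured =
  applyConfigure(defaults, Some((s: StubConsumerSettings) => s.copy(groupId = "backup-group")))

// Passing None keeps the defaults untouched
val untouched = applyConfigure(defaults, None)
```

Passing None (the default) for either parameter leaves the settings loaded from configuration unchanged.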
Type Members
- type BatchedCursorContext = CommittableOffsetBatch
  The type that represents the result of batching a CursorContext.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
 
- type Control = org.apache.pekko.kafka.scaladsl.Consumer.Control
  The type that represents how to control the given stream, i.e. if you want to shut it down or add metrics.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
 
- type CursorContext = CommittableOffset
  The type of the context to pass around. In the context of a Kafka consumer, this typically holds offset data to be automatically committed.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
 
- type MatCombineResult = DrainingControl[Done]
  The type that represents the result of the combine parameter that is supplied to pekko.stream.scaladsl.Source.toMat.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
 
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any

- final def ##: Int
  - Definition Classes: AnyRef → Any

- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any

- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
 
- def batchCursorContext(cursors: Iterable[CommittableOffset]): CommittableOffsetBatch
  How to batch an immutable iterable of CursorContext into a BatchedCursorContext.
  - cursors: The cursors that need to be batched.
  - returns: A collection data structure that represents the batched cursors.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
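In practice the real implementation delegates to CommittableOffsetBatch, which tracks the latest offset per topic-partition. The stand-in sketch below illustrates that idea with plain collections (StubOffset and batchOffsets are hypothetical names, not part of the library).

```scala
// Hypothetical stand-in for a cursor holding Kafka offset data.
final case class StubOffset(topic: String, partition: Int, offset: Long)

// Batch an iterable of cursors down to the highest offset seen per
// topic-partition, which is what a committable offset batch conceptually is.
def batchOffsets(cursors: Iterable[StubOffset]): Map[(String, Int), Long] =
  cursors.foldLeft(Map.empty[(String, Int), Long]) { (acc, c) =>
    val key = (c.topic, c.partition)
    acc.updated(key, acc.get(key).fold(c.offset)(math.max(_, c.offset)))
  }

val batched = batchOffsets(
  List(StubOffset("events", 0, 5L), StubOffset("events", 0, 7L), StubOffset("events", 1, 3L))
)
// batched == Map(("events", 0) -> 7L, ("events", 1) -> 3L)
```

Committing one batched cursor per partition instead of one commit per message is what makes high-throughput consumption practical.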
 
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws(classOf[java.lang.CloneNotSupportedException]) @HotSpotIntrinsicCandidate() @native()
 
- def commitCursor: Sink[CommittableOffsetBatch, Future[Done]]
  - returns: A Sink that allows you to commit a CursorContext to Kafka to signify you have processed a message.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
 
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef

- def equals(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef → Any

- final def getClass(): Class[_ <: AnyRef]
  - Definition Classes: AnyRef → Any
  - Annotations: @HotSpotIntrinsicCandidate() @native()
 
- def getSource: SourceWithContext[ReducedConsumerRecord, CommittableOffset, org.apache.pekko.kafka.scaladsl.Consumer.Control]
  - returns: A SourceWithContext that returns a Kafka stream which automatically handles committing of cursors.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
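A SourceWithContext pairs each element with a context that travels alongside it: transformations on the record leave the cursor attached so it can be batched and committed downstream. The plain-collections sketch below illustrates that pairing; StubRecord and StubCursor are hypothetical stand-ins, not library types.

```scala
// Hypothetical stand-ins for ReducedConsumerRecord and CommittableOffset.
final case class StubRecord(topic: String, key: String, value: String)
final case class StubCursor(partition: Int, offset: Long)

// A list of (record, cursor) pairs stands in for the stream's elements.
val stream: List[(StubRecord, StubCursor)] = List(
  (StubRecord("events", "k1", "v1"), StubCursor(0, 10L)),
  (StubRecord("events", "k2", "v2"), StubCursor(0, 11L))
)

// Mapping over the records leaves each cursor context untouched, so the
// offsets remain available for batching and committing after processing.
val processed: List[(String, StubCursor)] =
  stream.map { case (record, cursor) => (record.value.toUpperCase, cursor) }
```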
 
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @HotSpotIntrinsicCandidate() @native()

- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
 
- lazy val logger: Logger
  - Attributes: protected
  - Definition Classes: LazyLogging
  - Annotations: @transient()
 
- def matCombine: (org.apache.pekko.kafka.scaladsl.Consumer.Control, Future[Done]) => DrainingControl[Done]
  - returns: The result of this function gets directly passed into the combine parameter of pekko.stream.scaladsl.Source.toMat.
  - Definition Classes: KafkaConsumer → KafkaConsumerInterface
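The combine function's job is to merge the stream's Control (the handle for stopping it) with its completion Future into a single draining handle, which is what pekko's DrainingControl provides. The sketch below mirrors that shape with hypothetical stand-in types (StubControl and StubDrainingControl are not library names).

```scala
import scala.concurrent.Future

// Hypothetical stand-ins for Consumer.Control and DrainingControl.
final case class StubControl(id: String)
final case class StubDrainingControl[T](control: StubControl, streamCompletion: Future[T])

// Same shape as matCombine: (Control, Future[Done]) => DrainingControl[Done].
// This is what gets passed to Source.toMat as the combine parameter.
val combine: (StubControl, Future[Unit]) => StubDrainingControl[Unit] =
  (control, done) => StubDrainingControl(control, done)

val drained = combine(StubControl("stream-1"), Future.successful(()))
```

Keeping both in one value means callers can drain and stop the stream, then await the same handle's completion Future.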
 
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef

- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @HotSpotIntrinsicCandidate() @native()

- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @HotSpotIntrinsicCandidate() @native()
 
- final def synchronized[T0](arg0: => T0): T0
  - Definition Classes: AnyRef

- def toString(): String
  - Definition Classes: AnyRef → Any
 
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws(classOf[java.lang.InterruptedException])

- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws(classOf[java.lang.InterruptedException]) @native()

- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws(classOf[java.lang.InterruptedException])