Commit 071fa4f
Merge pull request #305 from redhat-developer-demos/camelk-patch: Fix typos
2 parents fa0141a + b5935f6

1 file changed: documentation/modules/advanced/pages/camel-k-cbr.adoc (20 additions, 19 deletions)
@@ -4,15 +4,15 @@ include::_attributes.adoc[]
 
 At the end of this chapter you will be able to:
 
-- How to run integrate Apache Kafka and Camel-K
-- Apply Content Based Routing (CBR) Enterprise Integration Pattern(EIP)
+- How to integrate Apache Kafka and Camel-K
+- Apply the Content Based Routing (CBR) Enterprise Integration Pattern (EIP)
 
 Apache Camel supports numerous Enterprise Integration Patterns (EIPs) out-of-the-box; you can find the complete list of patterns on the Apache Camel https://camel.apache.org/manual/latest/enterprise-integration-patterns.html[website].
 
 [sidebar]
 .Content Based Router
 ****
-The Content Based Router examines the message content and routes the message to a different channel based on the data contained in the message. The routing can be based on a number of criteria such as existence of fields, specific field values etc. When implementing a Content Based Router, special caution should be taken to make the routing function easy to maintain as the router can become a point of frequent maintenance. In more sophisticated integration scenarios, the Content Based Router can take on the form of a configurable rules engine that computes the destination channel based on a set of configurable rules. footnote:[https://www.enterpriseintegrationpatterns.com/patterns/messaging/ContentBasedRouter.html]
+The Content Based Router examines the message content and routes messages to a different channel based on the data contained in the message. The routing can be based on a number of criteria, such as the existence of fields or specific field values. When implementing a Content Based Router, special caution should be taken to keep the routing function easy to maintain, as the router can become a point of frequent maintenance. In more sophisticated integration scenarios, the Content Based Router can take on the form of a configurable rules engine that computes the destination channel based on a set of configurable rules. footnote:[https://www.enterpriseintegrationpatterns.com/patterns/messaging/ContentBasedRouter.html]
 ****
 
 [[cbr-app-overview]]
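The Content Based Router described in the sidebar can be sketched in Camel's YAML DSL using the `choice`/`when` construct. The fragment below is illustrative only and is not part of the tutorial's files; the `fruits` Kafka topic and broker address match the tutorial, but the `nutritions.sugar` field and the log endpoints are assumptions:

```yaml
# Illustrative Content Based Router in the Camel YAML DSL (not from the
# tutorial). Each fruit record consumed from the `fruits` topic is routed
# to a different step depending on its content; the `nutritions.sugar`
# field used in the predicate is an assumption.
- from:
    uri: "kafka:fruits?brokers=my-cluster-kafka-bootstrap.kafka:9092"
    steps:
      - unmarshal:
          json: {}
      - choice:
          when:
            - jsonpath: "$[?(@.nutritions.sugar > 5)]"
              steps:
                - log:
                    message: "sugary fruit: ${body}"
          otherwise:
            steps:
              - log:
                  message: "other fruit: ${body}"
```

The predicate of each `when` clause is an expression (here JSONPath); the first clause that matches wins, and `otherwise` catches everything else.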
@@ -25,16 +25,16 @@ image::cbr_app_overview.png[align="center"]
 
 The application has the following components:
 
-- *Data Producer*: The Camel-K integration application, that will produce data simulating the streaming data by sending the data to https://kafka.apache.org[Apache Kafka]
-- *Data Processor*: The Camel-K integration application, that will process the streaming data from Kafka and send the default Knative Eventing Broker
-- *Event Subscriber(Fruits UI)*: The https://quarkus.io[Quarkus] Java application, that will display the processed data from the Data Processor
-- *Event Trigger*: This is xref:knative-tutorial-eventing:ROOT:eventing-trigger-broker.adoc[Knative Event Trigger] that apply a filter on the processed data, to send to the Event Subscriber
+- *Data Producer*: A Camel-K integration application that produces simulated streaming data by sending it to https://kafka.apache.org[Apache Kafka]
+- *Data Processor*: A Camel-K integration application that processes the streaming data from Kafka and sends it to the default Knative Eventing Broker
+- *Event Subscriber (Fruits UI)*: A https://quarkus.io[Quarkus] Java application that displays the processed data from the Data Processor
+- *Event Trigger*: A xref:knative-tutorial-eventing:ROOT:eventing-trigger-broker.adoc[Knative Event Trigger] that applies a filter on the processed data before sending it to the Event Subscriber
 
-The upcoming samples will deploy these individual components and finally we test the integration by wiring them all together.
+The upcoming samples will deploy these individual components; after that we will test the integration by wiring them all together.
 
 Just make sure:
 
-* Review xref:eventing/eventing.adoc[Knative Eventing] module to refresh the concepts
+* To review the xref:eventing/eventing.adoc[Knative Eventing] module to refresh the concepts
 * Apache Kafka xref:deploy-apache-kafka.adoc[my-cluster] is running
 
 :service-file: default-broker.yaml
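The `:service-file:` attribute points at `default-broker.yaml`, whose body is not shown in this diff. A minimal Knative Broker manifest of that shape might look like the sketch below; the `knativetutorial` namespace is taken from the trigger output shown later on this page, and the manifest as a whole is an assumption, not the tutorial's actual file:

```yaml
# Illustrative default Broker manifest (the tutorial's actual
# default-broker.yaml is not part of this diff).
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: default
  namespace: knativetutorial
```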
@@ -78,33 +78,34 @@ cd $TUTORIAL_HOME/{camelk-repo}
 [#camel-k-cbr-data-producer]
 == Deploy Data Producer
 
-Knative Camel-K integration called `fruits-producer` which will use a public http://fruityvice.com[fruits API] to retrieve the information about fruits and stream the data to Apache Kafka. The `fruits-producer` service retrieves the data from the fruits API, splits it using the https://camel.apache.org/manual/latest/split-eip.html[Split EIP] and then sends the data to a Kafka topic called `fruits`.
+This is a Knative Camel-K integration called `fruits-producer`, which uses a public http://fruityvice.com[fruits API] to retrieve information about fruits and stream the data to Apache Kafka. The `fruits-producer` service retrieves the data from the fruits API, splits it using the https://camel.apache.org/manual/latest/split-eip.html[Split EIP], and then sends the data to a Kafka topic called `fruits`.
 
 .Fruits producer
 [source,yaml]
 ----
 - from:
     uri: timer:tick
     parameters:
-      period: 5000
+      period: 5000 #<1>
     steps:
       - set-header:
           name: CamelHttpMethod
          constant: GET
-      - to: "https://fruityvice.com/api/fruit/all" #<1>
+      - to: "https://fruityvice.com/api/fruit/all" #<2>
       - split:
-          jsonpath: "$.[*]" #<2>
+          jsonpath: "$.[*]" #<3>
       - marshal:
           json: {}
       - log:
           message: "${body}"
-      - to: "kafka:fruits?brokers=my-cluster-kafka-bootstrap.kafka:9092" #<3>
+      - to: "kafka:fruits?brokers=my-cluster-kafka-bootstrap.kafka:9092" #<4>
 ----
 
 <1> Poll every 5 seconds from the REST API http://fruityvice.com.
 <2> Call the external REST API http://fruityvice.com to get the list of fruits to simulate the data streaming
-<3> Send the processed data i.e. the individual fruit record as JSON to Apache Kafka Topic
+<3> Split the message into individual fruit records
+<4> Send the processed data, i.e. each individual fruit record as JSON, to the Apache Kafka topic
 
 Run the following command to deploy the `fruits-producer` integration:
@@ -362,7 +363,7 @@ fruits-processor-to-knative True 2m22s
 [[camel-k-cbr-event-subscriber]]
 == Deploy Event Subscriber
 
-Let us now deploy a https://en.wikipedia.org/wiki/Reactive_programming[Reactive] Web application called `fruit-events-display`. It is a https://quarkus.io[Quarkus] Java application, that will update UI(reactively) as and when it receives the processed data from the Knative Eventing backend.
+Now deploy a https://en.wikipedia.org/wiki/Reactive_programming[Reactive] web application called `fruit-events-display`. It is a https://quarkus.io[Quarkus] Java application that reactively updates its UI as and when it receives processed data from the Knative Eventing backend.
 
 You can deploy the `fruit-events-display` application using the command:
@@ -434,7 +435,7 @@ image::cbr_app_ui_empty.png[align="center"]
 [[camel-k-cbr-event-filter]]
 == Apply Knative Filter
 
-As a last step let us now deploy a Knative Event Trigger called `fruits-trigger`. The trigger consumes the events from the Knative Event Broker named `default`, when the fruit event is received it will dispatch the events to the subscriber -- that is `fruit-events-display` service --.
+As a last step we need to deploy a Knative Event Trigger called `fruits-trigger`. The trigger consumes events from the Knative Event Broker named `default`; when a fruit event is received, it dispatches the event to the subscriber, which is the `fruit-events-display` service.
 
 [source,yaml]
 ----
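The body of the Trigger manifest falls outside this hunk. For orientation only, a Knative Trigger of the shape the prose describes might look like the sketch below; the `type: sweet` filter attribute and the `serving.knative.dev/v1` subscriber kind are assumptions, not the tutorial's actual values:

```yaml
# Illustrative Trigger manifest; the tutorial's actual file is not part
# of this diff. The filter attribute (type: sweet) is an assumption.
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: fruits-trigger
  namespace: knativetutorial
spec:
  broker: default
  filter:
    attributes:
      type: sweet
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: fruit-events-display
```

A Trigger's `filter.attributes` map is matched against CloudEvent attributes; only events whose attributes match every entry are dispatched to the subscriber.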
@@ -476,7 +477,7 @@ Let us check the status of the Trigger using the command `kubectl -n {tutorial-n
 kubectl -n {tutorial-namespace} get triggers
 ----
 
-As the trigger will dispatch its filtered event to `fruit-events-display` , the subscriber URI of the Trigger will be that of `fruit-events-display` service.
+As the trigger will dispatch its filtered events to `fruit-events-display`, the subscriber URI of the Trigger will be that of the `fruit-events-display` service.
 
 [.console-output]
 [source,bash,subs="+quotes,+attributes,+macros"]
@@ -488,7 +489,7 @@ sugary-fruits True default http://fruits-events-display.knativetutorial.svc.
 [[verify-e2e]]
 == Verify end to end
 
-Now that we have all the components for the <<cbr-app-overview>>, let us verify the end to end flow:
+Now that we have all the components for the <<cbr-app-overview>>, let's verify the end-to-end flow:
 
 To verify the data flow and processing, call the `fruits-producer` service using the script `$TUTORIAL_HOME/bin/call.sh` with parameters `fruits-producer` and ''.