“Found unsupported keytype (18)” Error While Connecting to Kafka

Older versions of Java, even widely used ones like SE 8, can generate this error if your JCE policy files are out of date. To be precise, the JCE jars are the Java Cryptography Extension Unlimited Strength Jurisdiction Policy Files. Replacing the two JCE jar files might fix it.

Grab a .zip of the files from Oracle. There should be two jars inside: local_policy.jar and US_export_policy.jar.

Copy them into your JRE’s lib/security folder and overwrite the older ones. Your path will vary, but on a Windows machine it could be something like: C:/Program Files/Java/jdk1.8.0_151/jre/lib/security/.
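To check whether the unlimited-strength policy is actually in effect, you can print the maximum allowed AES key length. This is a quick sketch assuming a JDK 8-era install with jrunscript on the PATH (newer JDKs ship with unlimited strength enabled by default, so this check mainly matters on older versions):

```shell
# Prints 128 if the limited policy files are still installed;
# 2147483647 (Integer.MAX_VALUE) means unlimited strength is active.
jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
```

If this prints 128, the old restricted policy jars are still in place and the copy step above has not taken effect yet.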

“Module not specified” Error in IntelliJ

This happened after I renamed the modules to more descriptive logical names in IntelliJ 2018.3, Ultimate Edition. As a result, my two Maven module folders were no longer marked as modules in IntelliJ (no blue square in the bottom right corner of the folder icon in the Project panel). To solve the issue, I did the following:

  1. “Unmark” any resources, java or test folders in each module folder (right-click, choose “Mark Directory As”).
  2. Go to File > Project Structure, select Modules under Project Settings.
  3. Click the Copy icon (next to the + and – icons).
  4. In the Copy Module dialog that pops up, select the source folder for your module under “Module file location”, click OK.
  5. Open your Run Configuration screen. If “Use classpath of module” still only offers “no module”, back up any of your Run Config settings & options, then delete the old configuration and Add New Configuration (+ icon).



Note to Self: How to Copy, Clone, Duplicate a Multi-Module Maven Project in IntelliJ

I’ve tried this with IntelliJ Ultimate 2018.3 on Windows 10 and it worked.

1. Copy/duplicate original project folder inside /IdeaProjects/ via the OS file explorer (not IntelliJ).

2. Delete its workspace.xml file inside the .idea folder.

3. Delete its .git folder (since you’re starting a new project, based off an existing one).

4. Open the project (do not Import) via IntelliJ.

5. IntelliJ might complain about a missing Git directory. Go to File > Settings > Version Control and delete any entries shown in red. Click Apply & OK.

6. Double-check your Project Structure panel and make sure each module’s language level is set to an appropriate value for your code.
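Steps 1–3 can be sketched on the command line. The folder names below are placeholders, and the first two commands just build a throwaway stand-in for an existing project so the sketch is self-contained:

```shell
# Stand-in for an existing multi-module project (illustration only)
mkdir -p IdeaProjects/originalProject/.idea IdeaProjects/originalProject/.git
touch IdeaProjects/originalProject/.idea/workspace.xml IdeaProjects/originalProject/pom.xml

# Step 1: duplicate the project folder via the OS, not IntelliJ
cp -R IdeaProjects/originalProject IdeaProjects/clonedProject

# Step 2: delete the copied workspace.xml (per-machine IDE state)
rm IdeaProjects/clonedProject/.idea/workspace.xml

# Step 3: delete the copied .git folder to start a fresh history
rm -rf IdeaProjects/clonedProject/.git
```

The original project keeps its workspace.xml and Git history; only the clone starts clean.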


Note to Self: Getting Apache Kafka Up & Running for Local Testing

This is a quick note on starting a single-broker Kafka 2.0.1 instance (Scala 2.11) on macOS, using Terminal. This is the standalone version (no Homebrew).

Terminal window 1

$ cd /Users/yourUsrName/pathToKafka/kafka_2.11-2.0.1
$ sh bin/zookeeper-server-start.sh config/zookeeper.properties

Terminal window 2

Optional: test the ZooKeeper connection via telnet (type stat once the session opens):

$ telnet localhost 2181
stat

Move on to starting a Broker:

$ cd /Users/yourUsrName/pathToKafka/kafka_2.11-2.0.1
$ sh bin/kafka-server-start.sh config/server.properties

Terminal window 3

Create a Topic:

$ cd /Users/yourUsrName/pathToKafka/kafka_2.11-2.0.1
$ sh bin/kafka-topics.sh --create --topic my_topic_2 --zookeeper localhost:2181 --replication-factor 1 --partitions 1

For quicker local testing, I left --partitions and --replication-factor set to 1.

See the log directory created for the topic:

$ cd /tmp/kafka-logs/my_topic_2-0

See list of available Topics:

$ cd /Users/yourUsrName/pathToKafka/kafka_2.11-2.0.1
$ sh bin/kafka-topics.sh --list --zookeeper localhost:2181

Start a Producer:

$ sh bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my_topic_2

Once you get the “>” prompt, try typing some messages:

> How is this thing doing? Is it on?  
> Anyone?

Terminal window 4

Start a Consumer:

$ cd /Users/yourUsrName/pathToKafka/kafka_2.11-2.0.1
$ sh bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my_topic_2 --from-beginning

Once the Consumer’s running without errors, you should see the messages you sent via the Producer window showing up.

Apache Kafka Consumer Error “zookeeper is not a recognized option”

As I was brushing up on Apache kafka_2.11-2.0.1 (Scala 2.11, Kafka 2.0.1) on macOS Mojave, I ran into this minor hiccup while trying to spin up a command-line Consumer:

$ sh bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic my_topic --from-beginning
zookeeper is not a recognized option

Turns out they changed that option from “--zookeeper localhost:2181” to “--bootstrap-server localhost:9092” (pointing at the broker instead of ZooKeeper). The new command looks like so:

$ sh bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my_topic --from-beginning
Your message shows up here

Cross Domain Error on Localhost on Any Error in Google Apps Script

I noticed an odd-feeling error while playing with a frontend AJAX call from a localhost server that fetched data from a Google Sheet via a Google Apps Script middleware script.

Any time there was any kind of error in the Google Apps Script code, the JSON-P-based frontend call received this message (visible in the Chrome DevTools Console):

Cross-Origin Read Blocking (CORB) blocked cross-origin response https://script.google.com/macros/s/-yourScriptID-/exec?action=rd with MIME type text/html. See https://www.chromestatus.com/feature/5629709824032768 for more details.

Technically, of course, it was showing up as a CORS (Cross-Origin Resource Sharing) or CORB (Cross-Origin Read Blocking) error.

My guess is that Google Apps Script automatically sends its error messages back as text/html, and this trips up the JSON-P callback, which expects the response type to be “javascript”.