The Latest Gbloink!

In other ChatGPT news, and updating the post about Gbloink! I made a few months ago: there IS now a new Gbloink! I got GPT to translate it again, this time to JavaScript, and, more importantly, to a new WebAudio-based GM synth library called TinySynth.

You can play Gbloink! here.

It’s actually pretty good / playable and fun now.

Another win for ChatGPT. I spent about 4.5 hours with it, knocking this into shape last week.

ChatGPT and Chronic Task Switching

I honestly don’t know if ChatGPT is the ultimate tool for me, or if it’s just feeding my madness.

Sitting here with 3 chats open: one is crunching up examples of voice-leading algorithms to add to my Python scripts for chord progression generation in FL Studio; a second is trying to debug my Clojure build environment that’s failing to build the cljs part of Cardigan Bay; and the third is writing me a new static site generator using jinja2 templates (which I’ve never used before).

I can’t focus on one of these projects, so now I’m trying to do them in parallel by jumping from one ChatGPT window to another.

Thing is, it’s helping me manage context switches because each chat retains the context. I also have multiple text editors open etc.

I don’t suppose I’m as efficient as I would be if I did focus on one at a time. But I’m making enough forward progress in each of these projects to keep me engaged.

I’m thinking in terms of small blocks of code: e.g. “write me the function which does X”. But without having to maintain the fine-grained concentration of writing the code itself. GPT knows how to make data structures and loop through them etc. well enough by itself.

So I get to zoom out, and think about algorithms / architecture etc. On the fly. And in several projects at once.

It’s intoxicatingly exciting. But is also kinda shockingly crazy.

ChatGPT teaches me Android

I’m the kind of idiot who spends six months writing my own framework in Java on Android, because I’m too stupid / lazy / impatient to read and grok the documentation and learn what’s already there.

ChatGPT (3.5!) taught me more in a 15 minute chat last night than I’d been able to figure out from looking up docs online.

Go figure.


Jetpack Compose Overview

You

Can you explain to me, a person who has very little experience as an Android programmer, exactly what Jetpack Compose is, what it’s for, and how to use it?

ChatGPT

Certainly! Jetpack Compose is a modern UI toolkit for building Android user interfaces. It is a declarative UI framework developed by Google, designed to simplify and accelerate Android app development.

Traditionally, Android UI development involved using XML-based layouts and imperative programming with views and view groups. Jetpack Compose, on the other hand, allows you to define your UI using a more concise and intuitive Kotlin-based DSL (Domain-Specific Language). You describe what your UI should look like based on its current state, and the framework takes care of updating the UI when the state changes.

Here are some key features and benefits of Jetpack Compose:

  1. Declarative syntax: Compose uses a declarative approach, where you describe the desired state of your UI, and Compose takes care of updating the UI to match that state. This makes it easier to reason about your UI and helps in creating more maintainable code.
  2. Compose UI elements: Jetpack Compose provides a set of reusable UI elements, called “Composables,” which are functions that define a part of the UI. You can compose these elements together to build complex UI structures. Composables are highly customizable and can adapt to different screen sizes and orientations.
  3. State management: Compose provides a state management system that allows you to define and manage the state of your UI components. When the state changes, Compose automatically recomposes and updates the affected parts of the UI.
  4. Interactive and responsive UI: Compose makes it easy to handle user interactions, such as button clicks or text input. It also provides animations and transitions to create visually appealing and engaging user experiences.
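To make the declarative idea concrete, here is a minimal sketch of a counter screen (the names here are illustrative, and this assumes the Compose Material library is on the classpath). The UI is declared as a function of the `count` state, and Compose re-renders the affected parts automatically when the state changes:

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.material.Button
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue

// A hypothetical counter screen: the UI is a function of `count`.
// When `count` changes, Compose recomposes the Text for us --
// there is no findViewById and no manual view updating.
@Composable
fun CounterScreen() {
    var count by remember { mutableStateOf(0) }
    Column {
        Text("Clicked $count times")
        Button(onClick = { count++ }) {
            Text("Click me")
        }
    }
}
```

Note how there is no imperative “update the label” step: setting `count` is enough, and recomposition does the rest.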

To use Jetpack Compose, follow these steps:

  1. Set up your development environment: Make sure you have the latest version of Android Studio installed, as Jetpack Compose is tightly integrated with Android Studio.
  2. Create a new project: Start a new Android project in Android Studio or open an existing project.
  3. Enable Jetpack Compose: In your project’s build.gradle file, make sure you have the necessary dependencies for Jetpack Compose.
  4. Create Composables: Replace or modify your existing XML layouts with Compose code. You can define Composables by creating functions annotated with @Composable. These functions return UI elements or components.
  5. Compose the UI: Use the available Composables to build the desired UI structure. You can nest Composables, apply modifiers for customization, and handle user interactions.
  6. Manage state: Use state management features provided by Compose to handle changes in the UI. You can use mutable state variables and observe changes to trigger recomposition.
  7. Preview and test: Android Studio provides a preview feature for Jetpack Compose, allowing you to see the UI in real-time. Additionally, write tests to ensure the correctness of your UI code.
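For step 3, the module-level build.gradle additions might look something like the following sketch (the version numbers are illustrative; check the latest releases before using them):

```groovy
android {
    buildFeatures {
        // Turn on Compose support for this module
        compose true
    }
    composeOptions {
        // Must be compatible with your Kotlin version
        kotlinCompilerExtensionVersion "1.4.3"
    }
}

dependencies {
    implementation "androidx.compose.ui:ui:1.4.0"
    implementation "androidx.compose.material:material:1.4.0"
    implementation "androidx.compose.ui:ui-tooling-preview:1.4.0"
    implementation "androidx.activity:activity-compose:1.7.0"
}
```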

Jetpack Compose is still evolving, so it’s a good idea to refer to official documentation, tutorials, and sample code for more detailed and up-to-date information. The Android developer website and Jetpack Compose GitHub repository are excellent resources to start with.

I hope this gives you a good overview of Jetpack Compose and its usage in Android app development!

You

So it’s a bit like React for Android?

ChatGPT

Yes, there are similarities between Jetpack Compose and React in terms of their declarative nature and component-based architecture. Both frameworks allow you to build user interfaces by describing the desired state of the UI and automatically updating it when the state changes.

However, there are also some differences between the two:

  1. Language: Jetpack Compose is built specifically for Android app development using the Kotlin programming language, whereas React is primarily used for web development with JavaScript. Although React Native exists for mobile app development, it is a different framework from React itself.
  2. Platform-specific: Jetpack Compose is designed specifically for Android, leveraging the underlying Android platform APIs and features. React, on the other hand, is platform-agnostic and can be used for web, mobile, and desktop development across different platforms.
  3. Ecosystem: Jetpack Compose is part of the larger Android Jetpack library, which provides a wide range of components and libraries for various Android app development tasks. React, along with its ecosystem, has a broader reach and a vibrant community supporting web and mobile development.
  4. Tooling: Jetpack Compose is tightly integrated with Android Studio, the official IDE for Android development, providing a seamless development experience. React, on the other hand, has its own set of development tools and workflows, such as React Developer Tools and various build tools like Webpack.

While Jetpack Compose and React share some core concepts, it’s important to note that they are separate frameworks tailored for different platforms. If you’re familiar with React, you may find some similarities and principles applicable when working with Jetpack Compose, but there will still be differences in terms of syntax, APIs, and platform-specific considerations.

You

Ok. This is useful.

So Jetpack Compose is only part of Jetpack. What else is Jetpack and what is it used for?

ChatGPT

Jetpack is a set of libraries, tools, and guidance provided by Google to help developers build high-quality Android apps more easily and efficiently. It’s designed to address common challenges in Android app development, such as backward compatibility, app architecture, UI development, data management, and more.

Here are some key components and features of Jetpack:

  1. Architecture Components: Jetpack includes a set of libraries that help you design robust and maintainable app architectures. This includes libraries like ViewModel, LiveData, Room, and Paging, which provide tools for managing UI-related data, persisting data in a local database, and handling large data sets efficiently.
  2. UI Components: Jetpack provides libraries and tools for building modern user interfaces. Alongside Jetpack Compose, there are libraries like AppCompat, ConstraintLayout, RecyclerView, and Navigation, which simplify UI development, handle device compatibility, manage layouts, handle user navigation, and provide a consistent look and feel across different Android versions.
  3. Behavior Components: Jetpack offers libraries that address common behavioral patterns in Android apps. For example, WorkManager helps you manage background tasks and scheduling, while Notifications and Permissions provide streamlined ways to handle notifications and app permissions.
  4. Data Components: Jetpack provides libraries for working with data, such as Retrofit for networking, Room for local data persistence, and Paging for efficiently loading and displaying large data sets.
  5. Testing Components: Jetpack offers testing libraries and tools that make it easier to write unit tests and perform automated testing of Android apps. This includes libraries like AndroidX Test, Espresso for UI testing, and Mockito for mocking dependencies.
  6. Tooling: Jetpack encompasses a range of tools and integrations to enhance the development experience. This includes Android Studio, which provides features like code analysis, debugging, and visual UI editing support. The Jetpack Compose Preview in Android Studio allows you to see real-time updates of your Compose UI as you write code.
  7. Documentation and Samples: Jetpack provides extensive documentation, code samples, and best practices to guide developers in using its components effectively. The official Android developer website and the Android Jetpack GitHub repository are valuable resources for learning and reference.

Jetpack is designed to be modular, allowing you to choose the specific components that meet your app’s needs. You can integrate different Jetpack components into your app gradually, depending on your requirements and the level of functionality you want to leverage.

By using Jetpack, developers can benefit from a consistent set of APIs, reduced boilerplate code, increased productivity, and easier maintenance of Android apps. It promotes best practices and helps you build high-quality apps with a focus on stability, performance, and a great user experience.

You

Tell me more about the navigation part of Jetpack. I have an old Java Android app with many different “screens” I called “cards”. I wrote my own “card” handling code and my own finite state machine to manage the user’s navigation around different parts of the app.

In fact it was a game, where different cards represented play screens, score screens, splash screens and help screens.

It sounds like Jetpack would give me a proper Android way to do all this. Am I right? And how do I go about swapping out my custom code for Jetpack components?

ChatGPT

Absolutely! Jetpack Navigation is a Jetpack component that provides a framework for managing navigation in your Android app. It simplifies the implementation of navigation between different screens or destinations within your app, such as play screens, score screens, splash screens, and help screens.

With Jetpack Navigation, you can define the navigation structure of your app using a visual editor in Android Studio or by writing a navigation graph in XML. The navigation graph represents the screens or destinations in your app and the possible paths or actions that can be taken to navigate between them.

Here’s an overview of how you can use Jetpack Navigation to replace your custom code:

  1. Add the Navigation component: Ensure that you have the Navigation component added as a dependency in your app’s build.gradle file.
  2. Define destinations: Identify the different screens or cards in your app and define them as destinations in the navigation graph. This can be done either visually in the navigation editor or by writing the XML directly. Each destination represents a unique screen or card.
  3. Define actions: Specify the actions or paths between the different destinations in your navigation graph. For example, you can define actions to navigate from a play screen to a score screen or from a splash screen to a help screen.
  4. Handle user interactions: In your code, use the Navigation component to handle user interactions and trigger navigation actions. For example, you might have buttons or gestures that trigger navigation to a different screen. The Navigation component provides APIs to navigate to specific destinations or perform actions defined in the navigation graph.
  5. Pass data: If you need to pass data between screens, you can define arguments in your navigation graph and specify them when navigating between destinations. This allows you to pass information such as game scores or user preferences.
  6. Handle back navigation: Jetpack Navigation also simplifies handling the back button and back navigation within your app. It provides built-in support for the back stack and ensures that the back button behaves consistently across different screens.
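For example (step 4), a click handler in a Fragment might trigger a navigation action like this sketch — `playButton` and the action ID are illustrative names that would come from your own layout and navigation graph:

```kotlin
import androidx.navigation.fragment.findNavController

// Inside a Fragment such as MenuFragment: navigate along a
// graph-defined action when the user taps the Play button.
playButton.setOnClickListener {
    findNavController().navigate(R.id.action_menu_to_play_intro)
}
```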

By using Jetpack Navigation, you can benefit from a standardized and more maintainable approach to navigation. It helps you decouple navigation logic from individual screens and provides a clear structure for managing the flow of your app.

To integrate Jetpack Navigation into your existing app, you would need to:

  1. Convert your existing screens or cards into destinations in the navigation graph. Each screen or card becomes a separate destination, and the navigation graph defines the connections between them.
  2. Update your code to use the Navigation component for navigation instead of your custom handling. Replace your custom navigation code with calls to the Navigation component’s APIs to navigate between destinations.
  3. Modify any data passing mechanisms you have to align with the arguments defined in the navigation graph.
  4. Test your app thoroughly to ensure that the new navigation implementation works as expected.

The Android developer website provides comprehensive documentation and tutorials on using Jetpack Navigation, including guidance on setting up the navigation graph, defining actions, handling arguments, and handling back navigation. These resources will help you understand the details and best practices for implementing navigation in your app using Jetpack.

By adopting Jetpack Navigation, you’ll be able to leverage the standard Android approach to navigation, resulting in a more robust, maintainable, and consistent navigation experience for your users.

You

Sounds good. So say I have the following simple flow.

A first splash page. A menu to choose a help screen, play the game or show scores. The help screen itself. An intro to the play screen which leads to the play screen. And the scores screen. What would the XML version of this look like?

ChatGPT

Based on your description, the XML representation of the navigation graph for your simple flow would look like this:

<navigation xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    app:startDestination="@id/splashDestination">

    <fragment
        android:id="@+id/splashDestination"
        android:name="com.example.app.SplashFragment"
        android:label="Splash Fragment">
        <action
            android:id="@+id/action_splash_to_menu"
            app:destination="@id/menuDestination" />
    </fragment>

    <fragment
        android:id="@+id/menuDestination"
        android:name="com.example.app.MenuFragment"
        android:label="Menu Fragment">
        <action
            android:id="@+id/action_menu_to_help"
            app:destination="@id/helpDestination" />
        <action
            android:id="@+id/action_menu_to_play_intro"
            app:destination="@id/playIntroDestination" />
        <action
            android:id="@+id/action_menu_to_scores"
            app:destination="@id/scoresDestination" />
    </fragment>

    <fragment
        android:id="@+id/helpDestination"
        android:name="com.example.app.HelpFragment"
        android:label="Help Fragment" />

    <fragment
        android:id="@+id/playIntroDestination"
        android:name="com.example.app.PlayIntroFragment"
        android:label="Play Intro Fragment">
        <action
            android:id="@+id/action_play_intro_to_play"
            app:destination="@id/playDestination" />
    </fragment>

    <fragment
        android:id="@+id/playDestination"
        android:name="com.example.app.PlayFragment"
        android:label="Play Fragment" />

    <fragment
        android:id="@+id/scoresDestination"
        android:name="com.example.app.ScoresFragment"
        android:label="Scores Fragment" />

</navigation>

In this example, each destination is represented by a &lt;fragment&gt; tag with an android:id attribute to uniquely identify it. The android:name attribute specifies the fully qualified class name of the fragment associated with that destination. The android:label attribute can be used to provide a human-readable label for the destination (e.g., for screen readers or debugging purposes).

Actions between destinations are defined using &lt;action&gt; tags within the respective &lt;fragment&gt; elements. Each action is given a unique ID using the android:id attribute, and the app:destination attribute specifies the ID of the destination that the action leads to.

In this navigation graph:

  • The splashDestination is the starting destination defined as app:startDestination.
  • From the splashDestination, there is an action action_splash_to_menu leading to the menuDestination.
  • The menuDestination has actions action_menu_to_help, action_menu_to_play_intro, and action_menu_to_scores leading to the helpDestination, playIntroDestination, and scoresDestination, respectively.
  • The playIntroDestination has an action action_play_intro_to_play leading to the playDestination.
  • There are no actions defined for the helpDestination, playDestination, and scoresDestination, indicating that they are end destinations.

Note that you would need to replace the com.example.app package and fragment class names with the appropriate package and class names from your app.

This XML navigation graph defines the structure and connections between your different screens, allowing you to navigate between them using the Jetpack Navigation component.

Remember to integrate the navigation graph into your project and update your code to use the appropriate navigation APIs based on the IDs and actions defined.

You

Bloody hell. I wrote something that looks almost identical to this (except I did it all in code, so I had my own enum-based state machine and classes representing each fragment and action). I wish I’d known this existed.

ChatGPT

It’s great to hear that you had already implemented a navigation system similar to the Jetpack Navigation graph! While you may have implemented your own solution, discovering Jetpack Navigation now can be beneficial as it provides a standardized and well-supported approach to navigation in Android apps.

By adopting Jetpack Navigation, you can leverage the benefits of the Navigation component, such as its integration with Android Studio, support for back navigation, deep linking, and navigation argument handling. It also simplifies navigation-related code and allows you to visualize and manage the navigation flow using the XML navigation graph.

Even though you’ve already implemented your navigation system, migrating to Jetpack Navigation might still be worthwhile if you’re looking to enhance your codebase, leverage the official Android approach, and take advantage of the features and improvements offered by the Navigation component.

If you decide to migrate, you can use your existing code and knowledge as a foundation to understand the navigation flow and translate it into the Jetpack Navigation XML graph structure. You can then gradually update your code to use the Navigation component’s APIs and benefit from the additional features it provides.

Ultimately, Jetpack Navigation aims to simplify and standardize navigation in Android apps, making it easier for developers to manage and maintain their app’s navigation flow.

You

So let’s say I’m moving to Jetpack Navigation: are my separate screens (or fragments, as it calls them) still represented by classes?

ChatGPT

Yes, in Jetpack Navigation, each screen or destination is typically represented by a Fragment class. Fragments are a fundamental component in Android app development that represent a portion of the UI or a behavior within an activity.

When using Jetpack Navigation, you would create separate Fragment classes for each of your screens or destinations. These Fragment classes encapsulate the logic, UI, and lifecycle management of the respective screens.

Here’s an example of how you would define a Fragment class for one of your screens:

class MenuFragment : Fragment() {
    // Fragment initialization and setup

    override fun onCreateView(
        inflater: LayoutInflater,
        container: ViewGroup?,
        savedInstanceState: Bundle?
    ): View? {
        // Inflate the layout for this fragment
        return inflater.inflate(R.layout.fragment_menu, container, false)
    }

    // Other fragment lifecycle methods, event handling, etc.
}

In this example, MenuFragment is a Fragment class representing one of your screens. Within this class, you would typically override lifecycle methods such as onCreateView() to inflate the layout for the fragment and set up any UI or event handling logic specific to that screen.

You would have separate Fragment classes for each of your screens or destinations, such as SplashFragment, HelpFragment, PlayIntroFragment, PlayFragment, and ScoresFragment. These Fragment classes define the behavior and appearance of the respective screens.

In the navigation graph XML, each destination names its Fragment class via the android:name attribute, which allows the Navigation component to instantiate and manage the appropriate Fragments as you navigate through your app.

By using Fragment classes, you can encapsulate the logic and UI for each screen in a modular manner, making it easier to manage and maintain your app’s navigation flow.

You

I don’t actually understand what an inflater or inflation is.

ChatGPT

I apologize for the confusion. Let me clarify the concepts of inflater and inflation in Android.

In Android, inflation refers to the process of converting an XML layout file into corresponding View objects in memory. It involves creating the appropriate View hierarchy based on the XML file’s structure and attributes. The layout file describes the structure and appearance of the UI elements for a particular screen or component.

The inflater, on the other hand, is a system service or object responsible for performing the inflation process. It is an instance of the LayoutInflater class, which is used to instantiate View objects from XML layouts. The inflater takes an XML layout file as input and returns a corresponding View object hierarchy.

In the context of Fragments, the onCreateView() method is called when the Fragment is being created and displayed on the screen. It is responsible for creating the Fragment’s UI by inflating the corresponding XML layout file.

Here’s an example of using the inflater to inflate a layout within a Fragment’s onCreateView() method:

override fun onCreateView(
    inflater: LayoutInflater,
    container: ViewGroup?,
    savedInstanceState: Bundle?
): View? {
    // Inflate the layout for this fragment
    return inflater.inflate(R.layout.fragment_menu, container, false)
}

In this code snippet, inflater is the LayoutInflater instance provided as a parameter to the onCreateView() method. The R.layout.fragment_menu is the XML layout resource file representing the UI for the MenuFragment. The container parameter is the parent ViewGroup in which the fragment’s UI should be attached (if applicable).

By calling inflater.inflate(), you provide the XML layout resource file to the inflater, which then creates the corresponding View hierarchy. The inflated View hierarchy is returned from the method and will be displayed as the UI for the MenuFragment.

In summary, the inflater is responsible for converting an XML layout file into View objects, and inflation is the process of performing this conversion. The inflated View hierarchy is then used to define and display the UI for the Fragment or any other UI component in an Android app.

You

Thank you. Again that’s very helpful.

Now let’s suppose I need to pass around a PlayerScores object between the different fragments. How does the code look?

ChatGPT

To pass data, such as a PlayerScores object, between Fragments in Jetpack Navigation, you can use navigation arguments. Navigation arguments allow you to define parameters in the navigation graph and pass values between Fragments during navigation.

Here’s an example of how you can define and pass a PlayerScores object as an argument between Fragments:

  1. Define the argument in the navigation graph XML:
<!-- navigation_graph.xml -->
<fragment
    android:id="@+id/playDestination"
    android:name="com.example.app.PlayFragment"
    android:label="Play Fragment">
    <argument
        android:name="playerScores"
        app:argType="com.example.app.PlayerScores" />
</fragment>

In this example, the &lt;argument&gt; tag is added within the &lt;fragment&gt; tag for the PlayFragment. The android:name attribute specifies the name of the argument, and the app:argType attribute specifies the type of the argument, which is com.example.app.PlayerScores in this case.
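One caveat (and an assumption on my part about your class): for a custom app:argType like PlayerScores, the class itself must be Parcelable or Serializable so Safe Args can bundle it. With the kotlin-parcelize plugin, that can be as simple as this sketch:

```kotlin
import android.os.Parcelable
import kotlinx.parcelize.Parcelize

// Hypothetical PlayerScores class; @Parcelize generates the
// Parcelable boilerplate so Safe Args can pass it between
// destinations. Requires the `kotlin-parcelize` Gradle plugin.
@Parcelize
data class PlayerScores(
    val playerName: String,
    val scores: List<Int>
) : Parcelable
```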

  2. In the sending Fragment (e.g., MenuFragment), use the navigate() method and pass the PlayerScores object as an argument:
val playerScores = PlayerScores(/* initialize player scores */)
val action = MenuFragmentDirections.actionMenuToPlay(playerScores)
findNavController().navigate(action)

In this code snippet, MenuFragmentDirections.actionMenuToPlay() is an auto-generated class that provides a static method actionMenuToPlay() to create the action for navigating from MenuFragment to PlayFragment. The playerScores object is passed as an argument to the action.

  3. In the receiving Fragment (e.g., PlayFragment), retrieve the argument value:
val args: PlayFragmentArgs by navArgs()
val playerScores: PlayerScores = args.playerScores

In this code snippet, navArgs() is a property delegate that retrieves the arguments associated with the PlayFragment. By accessing args.playerScores, you can retrieve the PlayerScores object passed from the sending Fragment.

By utilizing navigation arguments, you can easily pass complex objects or other types of data between Fragments in Jetpack Navigation. It helps maintain a clear and consistent approach to data passing within your navigation flow.

Note: To use navigation arguments, ensure that you have the Navigation component dependencies configured in your project’s build.gradle file.

You

How do I ensure I have the Navigation component dependencies in my build.gradle file?

ChatGPT

To ensure that you have the necessary Navigation component dependencies in your project’s build.gradle file, you need to make the appropriate additions and configurations. Here’s a step-by-step guide:

  1. Open your project’s build.gradle file (the top-level build.gradle file, not the module-level one).
  2. Ensure that you have the google() repository included in the buildscript repositories section. It should look similar to this:
buildscript {
    repositories {
        google()
        // Other repositories...
    }
    // Other configurations...
}
  3. In the dependencies block of the buildscript, add the classpath dependency for the Navigation component. It should look similar to this:
buildscript {
    dependencies {
        classpath "androidx.navigation:navigation-safe-args-gradle-plugin:2.4.0"
        // Other dependencies...
    }
    // Other configurations...
}

Make sure the version number (2.4.0 in the example) matches the desired version of the Navigation component. You can check the latest version on the Google Maven Repository.

  4. Open the module-level build.gradle file (usually located in the app directory).
  5. In the plugins section, apply the Navigation safe args plugin. It should look similar to this:
plugins {
    id 'com.android.application'
    id 'kotlin-android'
    id 'androidx.navigation.safeargs.kotlin'
    // Other plugins...
}

Applying the androidx.navigation.safeargs.kotlin plugin enables support for navigation arguments in Kotlin.

  6. In the dependencies block of the module-level build.gradle, ensure that you have the necessary dependencies for the Navigation component. Add the following lines:
dependencies {
    // Other dependencies...
    implementation "androidx.navigation:navigation-fragment-ktx:2.4.0"
    implementation "androidx.navigation:navigation-ui-ktx:2.4.0"
}

These dependencies provide the necessary libraries for Fragment and UI-related navigation components in your app.

  7. Finally, sync your project with the Gradle files by clicking on the “Sync Now” button or by running a Gradle sync operation.

Once you have added the required dependencies, your project should be ready to utilize the Navigation component, including navigation arguments.

Remember to replace the version number (2.4.0) in the example with the desired version you intend to use.

By ensuring the proper dependencies are added and configured, you can leverage the capabilities of the Navigation component in your Android app.

Cardigan Bay 0.8.0

There’s a new Cardigan Bay out.

See the video

And get it at : https://github.com/interstar/cardigan-bay/releases/tag/0.8.0

What’s new :

  • Card-level editing. The card-bar (which opens at the bottom of a card) now lets you edit the source code of the card directly.
  • Workspace saving. In the ClojureScript workspace there’s now a button to save your work immediately.
  • Media library card: :system now has a command to make a card list the files in the media library.
  • Transclusion of cards from one page to another is now working (and officially supported).
  • Network diagrams: experimental. It now renders a kind of network diagram. No editor yet (but in progress).
  • Nicer UI

ChatGPT

Two stories about ChatGPT

Two days ago I tweeted :

I confess. I’ve succumbed and subscribed to ChatGPT Plus.

Eufrasio asked me why, so I replied

It’s great for translating code from one programming language to another. For example, I had a version of Gbloink! running in the browser that I wrote in CoffeeScript 10 yrs ago. There’s no future in CoffeeScript, but I just translated it to Python, with GPT doing about 90% of the work

This is true. I’m hoping that I can release a new Python-based version of Gbloink! this year. The Python is running in the browser thanks to PyScript. But I should also be able to use it for a desktop app. I may even look into Kivy.

As an aside, if you haven’t seen them, you might be interested to see that I’m actually having a lot of success now with the original (1997 era!) Gbloink! running with modern DAW software :

Anyway, on the main topic of ChatGPT, Abraham asked

How’s it looking so far? Worth every penny?

And I answered :

Currently (since payment) it’s given me some code to add a) auto-complete to the “move card” field in Cardigan Bay. And b) to check broken links in a page.

I haven’t integrated them in yet, but I hope to in the next couple of days. If they work it’s a definite productivity boost.

Which brings us to the second story, as I tweeted today :

Damn! OK. So #chatgpt just bullshitted me (or at least helped reinforce rather than correct my own stupidity)

I wanted to write a script to check broken links out of my wiki. And I got Chat to help me write it in Clojure.

Of course, I was thinking it would be nice to have this running in the browser. A handy component at the bottom of pages that checked all the external links on them to tell me if they were still valid.

So after some back and forth, it seems like me and chat got the code working.

And then I hit the CORS problem …

Of course you can’t run this in the browser. And when asked directly, #chatgpt KNOWS you can’t.

But it’s happily accompanied me along the path of writing a nice interactive Reagent based widget for doing all this work in the browser.

Now I thought I was smart enough to use #chatgpt and NOT be caught out by its bullshit.

And I now realise that’s bollocks.

Yes, I can totally be fooled by it.

I hope I’m wiser now.

Which I think is a good summary of where I am.

Yes, ChatGPT is an incredible productivity boost. And, it’s even good for “rubber ducking”, ie. having “someone” to talk a problem through with. I’m not regretting subscribing. BUT … you ALWAYS have to be alert that it can confidently assert things that aren’t true or don’t make sense.
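For what it’s worth, the broken-link check works fine as an ordinary server-side script, where CORS doesn’t apply. Here’s a minimal sketch of the idea in Python rather than Clojure (the function names are mine, invented for illustration, not from the actual script):

```python
import re
import urllib.request
import urllib.error

def extract_external_links(html):
    """Pull http(s) URLs out of href attributes in a page's HTML."""
    return re.findall(r'href="(https?://[^"]+)"', html)

def check_link(url, timeout=10):
    """Return True if the URL answers with a non-error status."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "link-checker"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, ValueError):
        return False

def broken_links(html):
    """Return the subset of a page's external links that fail the check."""
    return [u for u in extract_external_links(html) if not check_link(u)]
```

Run from the command line (or from the wiki’s own server process), none of this hits the browser’s same-origin restrictions.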

Current AI is Phenomenal

It really is.

I’ve been completely blown away by the progress that AI has made recently.

Firstly, like most people, I became obsessed with AI painting programs like DALL-E and Stable Diffusion. I even decided to become the world’s most unlikely fashion brand, by using AI painting to express a certain fetishism for synthesizers, DJ decks and other toys, combined with anachronistic geeky humour.

So you too can wear Gargoyle DJs on a gothic cathedral.


And medieval nuns discussing a Moog.


And carry a Victorian DJ tote bag


And admire pictures of pieces of fantasy gear like this:

Make Noise Da Vinci Buchla Acrylic Print

But even while I was marvelling at the images that AI was producing, I was still sceptical that AI would make much of a dent in programming.

I wasn’t wowed by Copilot.

But now ChatGPT has, indeed, blown me away. Not sure if it’s just because it has a better UX, or if it’s really just cleverer than Copilot.

But it is able to do things like this: I asked it for some file-uploader code in Clojure / ClojureScript (I really need to add an image gallery to Cardigan Bay). Here’s the response:

Sure, here is an example of how you could allow users to upload pictures to a gallery in your app using Ring and ClojureScript:

On the server side, using Ring, you could create a route that accepts file uploads and stores them in a directory on the server. Here is an example:


(defn upload-route
  "Handler for uploading files to the server. Accepts a file parameter
   and saves the file to the specified directory."
  [request]
  (let [upload-dir "./uploads"
        file (:file request)]
    (if (not (empty? file))
      (do
        (spit (str upload-dir file) (:content file))
        {:status 200
         :headers {"Content-Type" "text/plain"}
         :body "File uploaded successfully"})
      {:status 400
       :headers {"Content-Type" "text/plain"}
       :body "Error uploading file, no file specified."})))

On the client side, using ClojureScript, you could create a form that allows the user to select a file to upload, and then submits the file to the server using an AJAX request. Here is an example:

(ns my-app.core
  (:require [goog.net.XhrIo :as xhr]
            [goog.events :as events]
            [goog.dom :as dom]))

(defn upload-file
  "Handler for uploading a file to the server. Accepts a file input element
   and sends the selected file to the specified URL using an AJAX POST request."
  [file-input url]
  (let [file (.-files file-input)
        form-data (js/FormData.)]
    (if (not (empty? file))
      (do
        (.append form-data "file" (aget file 0))
        (xhr/send url
                  (fn [event]
                    (let [xhr (.-target event)]
                      (if (= (.-status xhr) 200)
                        (dom/setTextContent
                          (dom/getElement "upload-status")
                          "File uploaded successfully")
                        (dom/setTextContent
                          (dom/getElement "upload-status")
                          "Error uploading file"))))
                  "POST"
                  form-data
                  {"Content-Type" "multipart/form-data"}))
      (dom/setTextContent
        (dom/getElement "upload-status")
        "Error uploading file, no file selected."))))

(let [file-input (dom/getElement "file-input")
      upload-form (dom/getElement "upload-form")]
  (events/listen upload-form
                 events/EventType.SUBMIT
                 (fn [event]
                   (events/preventDefault event)
                   (upload-file file-input "/upload"))))

Note that this is just an example and may not include all the necessary code to fully implement file uploads in your app. You may need to add additional error handling, security checks, and other features to make it work for your specific use case.

OK. So I haven’t tried this yet. Maybe it’s riddled with bugs. But eyeballing it, it looks like a pretty good first draft. I mean, it’s as good as I’d get by googling an example and copying and pasting some code from it. And it took 30 seconds to generate, rather than maybe half an hour of concentrated googling. (Which would probably be a couple of hours in practice, in terms of finding a window to do it, getting into the zone, concentrating etc.)

So … what do I think now? I’m gonna try this. And if it works … really, I’m changing my mind drastically. I think AI really is going to internalise enough about the structure, syntax and inner coherences of code that AI-based tools are about to completely transform and revolutionise coding.

Which is scary. But also very exciting given how much frustration there still is in software development.

Why I’m not here much

I’m not writing much here on the SDI blog at the moment.

That’s partly because ThoughtStorms is now going so well. Cardigan Bay keeps improving as a tool for helping me knock it back into shape. The wiki / digital garden is still full of cruft and broken links and confusion of course. But also a lot of new stuff. And with Cardigan Bay always running on my local machine, it’s very easy to capture any ideas, notes or interesting links I discover, in that format rather than over here.

That doesn’t mean I’ve given up on this blog though. It just means I need to find new ways to connect the two together.

Also, I’ve been too heavily engaged on Twitter these days. But with the Musk takeover and other issues there, there seems to be real energy for people to get back to the Fediverse and use things like Mastodon. So I’m trying to keep more of my focus and attention there.

My Mastodon ID is @interstar@artoot.xyz

Starting a ClojureScript Project in 2022

I needed to start a simple ClojureScript project in 2022. And it was a faff.

I love Clojure. But creating projects has become much harder because of so many different ways to do it, different and changing tools, and outdated online documentation.

So tl;dr:

I seem to have found a solution that works for me (as of August 19, 2022).

Step 1: Make sure you have up-to-date Clojure CLI tools.

I had Clojure 1.10.1, and none of the tutorials I found worked; I just got incomprehensible error messages.

If you ever see an error message like :

-T is no longer supported, use -A with repl, -M for main, or -X for exec

That’s a definite sign you have an outdated version.

I had 1.10.1 because that’s the current Ubuntu version. It’s outdated, so install the CLI tools yourself. You want 1.11.1 (as of the time of writing).

I used

curl -O https://download.clojure.org/install/linux-install-1.11.1.1155.sh
chmod +x linux-install-1.11.1.1155.sh
sudo ./linux-install-1.11.1.1155.sh

Check your version with

clj --version

Step 2: Install clj-new

A useful tool from Sean Corfield.

See https://github.com/seancorfield/clj-new

Step 3: Use clj-new to create a project from the figwheel-main template.

clojure -Tclj-new create :template figwheel-main :name myname/myapp :args '["+deps","--reagent"]'

Then run the app with

clojure -M:fig:build

I hope this helps anyone else with similar confusion to me.

For a long discussion / pleading / ranting, see https://clojureverse.org/t/starting-a-new-clojurescript-project-in-2022-setup-suggestions/9222/18 and https://clojureverse.org/t/the-further-adventures-of-starting-a-new-clojurescript-project-in-2022/9228 . And thanks to everyone on Clojureverse who took time and patience to help me get this working.

Why I love Clojure

Here’s some code I’m writing for my day-job.

I have an iterator of IScoreObjects (basically things that live on a musical score or timeline).

I want an iterator of only the Notes, where Note is one of the types that implement the IScoreObject interface and can live on the score.

Here’s my code.

import java.util.Iterator;

public class NoteIterator implements Iterator<Note> {
    /**
     * Takes an iterator of IScoreObjects and returns an Iterator of only the Notes
     */
    Iterator<IScoreObject> isoIterator;
    Note _next;
    boolean _hasNext;

    public NoteIterator(Iterator<IScoreObject> isoi) {
        isoIterator = isoi;
        findNext();
    }

    private void findNext() {
        while (isoIterator.hasNext()) {
            IScoreObject n = isoIterator.next();
            if (n.isNote()) {
                _next = (Note)n;
                _hasNext = true;
                return;
            }
        }
        _next = null;
        _hasNext = false;
    }

    @Override
    public boolean hasNext() {
        return _hasNext;
    }

    @Override
    public Note next() {
        Note rv = _next;
        findNext();
        return rv;
    }
}

On the other hand, here’s how I’d do it in Clojure

(defn just-the-notes [score-objects] (filter #(.isNote %) score-objects))

That’s something like a 30:1 ratio in line count. And the Java needs an extra file of its own. That is insane.
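For comparison, Python gets similar mileage from a generator expression. This is a hypothetical sketch — ScoreObject and is_note are stand-ins I’ve invented for the real IScoreObject / isNote from the day-job codebase:

```python
from dataclasses import dataclass

# Minimal stand-in for the score-object types (hypothetical).
@dataclass
class ScoreObject:
    kind: str

    def is_note(self):
        return self.kind == "note"

def just_the_notes(score_objects):
    """Keep only the Notes, lazily, like the Clojure filter."""
    return (so for so in score_objects if so.is_note())
```

Like the Clojure version, this is lazy: nothing is scanned until you actually consume the result.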