Why Functional Programming Matters - higher order functions and lazy evaluation - example

I have been reading Why Functional Programming Matters
The text points out that using higher-order functions and lazy evaluation we can create modular software:

In this paper, we’ve argued that modularity is the key to successful programming.

Functional programming languages provide two new kinds of glue - higher-order functions and lazy evaluation. Using these glues one can modularise programs in new and exciting ways, and we’ve shown many examples of this.

Can someone give a few small/basic examples in Clojure? (The examples in the text are hard for me to understand.)
Also, can you share your experience with creating modular software in Clojure?

The book and lectures of Structure and Interpretation of Computer Programs (SICP) give a good explanation of modularity using a Lisp.

The first two chapters of SICP explain how substitution and abstraction can be used to organize a program. Names are used to describe processes and values, and the programmer can refer to the shorter descriptive names that encapsulate or abstract from the lower levels.
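In Clojure this layering of names might look like the following (a toy example of my own, not taken from SICP):

```clojure
;; Low level: the arithmetic of a distance computation
(defn square [x] (* x x))

(defn distance [[x1 y1] [x2 y2]]
  (Math/sqrt (+ (square (- x2 x1))
                (square (- y2 y1)))))

;; Higher level: we refer only to the name `distance`,
;; not to the arithmetic it abstracts away
(defn close? [p q]
  (< (distance p q) 1.0))

(close? [0 0] [0.5 0.5]) ; => true
```

The caller of `close?` never sees the square roots; each layer only speaks in the names defined by the layer below it.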

Chapter 3, Modularity, Objects, and State, shows that the substitution view is limited: we also need a way to make programs “modular”, “so that they can be divided ‘naturally’ into coherent parts that can be separately developed and maintained”:

To a large extent … the way we organize a large program is dictated by our perception of the system to be modeled. In this chapter we will investigate two prominent organizational strategies arising from two rather different “world views” of the structure of systems. The first organizational strategy concentrates on objects, viewing a large system as a collection of distinct objects whose behaviors may change over time. An alternative organizational strategy concentrates on the streams of information that flow in the system, much as an electrical engineer views a signal-processing system.

Two of those strategies described in SICP are:

  1. Viewing the system as a collection of distinct objects that change over time.
  2. Focusing on streams of information that flow in the system.

In the first approach, assignment is used to encapsulate internal data and state in objects. When modelling with assignment, the interpreter (which is just another program) needs to keep track of an environment of functions and variables. The book shows that it is difficult to write such an interpreter, and that it is also hard to reason about programs with mutable state (since functions can return different results depending on the changing environment), which can lead to bugs.

In Chapter 3, Modularity, Objects, and State, some examples are given of how to modularize a program this way. For example, by writing a make-withdraw constructor whose objects have local state and can be used to withdraw funds from independently:

(defn make-withdraw [balance]
  (fn [amount]
    (if (>= @balance amount)
      (reset! balance (- @balance amount)) ; reset! returns the new balance
      "Insufficient funds")))

;; Make-withdraw can be used as follows to create two objects W1 and W2:

(def W1 (make-withdraw (atom 100))) ; use of Clojure's atom for mutable state
(def W2 (make-withdraw (atom 100)))
(W1 50) ; => 50
(W2 70) ; => 30
(W2 40) ; => "Insufficient funds"
(W1 40) ; => 10

Here the order in which operations occur matters.
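To see this, compare the same two withdrawal requests applied to two fresh accounts in opposite orders (continuing with make-withdraw from above; the result comments assume a fresh REPL):

```clojure
;; Withdraw 50 then 60:
(def A (make-withdraw (atom 100)))
(A 50) ; => 50
(A 60) ; => "Insufficient funds"

;; The same requests in the opposite order leave a different balance:
(def B (make-withdraw (atom 100)))
(B 60) ; => 40
(B 50) ; => "Insufficient funds"
```

Account A ends up with 50 and account B with 40, even though both received exactly the same set of requests.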

In the second way to modularize programs, using lazy streams, time is decoupled from the order of events by lazy evaluation. As is written in SICP 3.5.1 Streams Are Delayed Lists:

The basic idea is to arrange to construct a stream only partially, and to pass the partial construction to the program that consumes the stream. If the consumer attempts to access a part of the stream that has not yet been constructed, the stream will automatically construct just enough more of itself to produce the required part, thus preserving the illusion that the entire stream exists. In other words, although we will write programs as if we were processing complete sequences, we design our stream implementation to automatically and transparently interleave the construction of the stream with its use.

As an example of streams, SICP first simplifies the earlier make-withdraw function so that it just monitors the bank balance in an account:

(defn make-simplified-withdraw [balance]
  (fn [amount]
    (reset! balance (- @balance amount))
    @balance))

(def W (make-simplified-withdraw (atom 100)))
(W 10) ; => 90
(W 50) ; => 40
(W 40) ; => 0
(W 10) ; => -10

Then this can be alternatively modeled using streams with a “withdrawal processor as a procedure that takes as input a balance and a stream of amounts to withdraw and produces the stream of successive balances in the account”:

(defn stream-withdraw [balance amounts]
  (cons
   balance
   (lazy-seq ; will invoke the body only the first time seq is called and cache result
    (when-let [s (seq amounts)]
      (stream-withdraw (- balance (first s)) (rest s))))))

(stream-withdraw 100 [10 50 40 10]) ; => (100 90 40 0 -10)
(second (stream-withdraw 100 [10 50 40 10]))  ; => 90

;; Note that stream-withdraw can also be written using the higher-order function
;; reductions that takes - as a function and applies it repeatedly and lazily to
;; the starting value 100:
(defn stream-withdraw [balance amounts]
  (reductions - balance amounts))

From the user’s perspective this functional model can have the same behaviour as the model built from objects, yet there is no assignment and no local state variable; the system nevertheless has state.

As SICP describes, modelling with objects is powerful and intuitive, but it raises problems with the ordering of events and with synchronizing multiple processes. The possibility of avoiding these problems makes functional languages, which have no assignment or mutable data, attractive for dealing with concurrent systems. Time-related problems can, however, still creep in.

Check out SICP to gain more clarity on higher-order functions and lazy evaluation.


Clojure does not have the same kind of laziness that Haskell has. In Haskell, every expression is lazily evaluated. In Clojure, only sequences are lazily evaluated. But you can still see the same kind of effect if you stick to sequences.
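For instance (a small sketch of my own), mapping a function over an infinite range does no work until the sequence is consumed:

```clojure
;; Nothing is computed when xs is defined - map is lazy
(def xs (map (fn [n]
               (println "realizing" n)
               (* n n))
             (range))) ; an infinite sequence

;; Work happens only on consumption, and only for (roughly) what we ask for.
;; Note: range is chunked, so Clojure may realize elements in batches of 32.
(take 3 xs) ; => (0 1 4)
```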

Here’s an example from my own code. It separates how to query the database from deciding how many items you need from the database:

;; query-db-table reads one page of 10 items from the db
;; read-db returns a lazy seq of all items from query-db-table starting at a page
(defn read-db
  ([]
   (read-db 0))
  ([page]
   (when-some [results (query-db-table page)]
     (lazy-cat results (read-db (inc page))))))

Then somewhere else I decide how much I actually need:

;; return the first items whose total cost exceeds x dollars
(defn take-first-x-dollars [items dollars]
  (lazy-seq
    (cond
      (empty? items)
      nil

      (not (pos? dollars))
      nil

      :else
      (let [item (first items)]
        (cons item
              (take-first-x-dollars (rest items)
                                    (- dollars (:cost item))))))))

It’s pretty contrived, but it shows how you can separate figuring out how to read items from the database from deciding how many items you’ll need.
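Since query-db-table itself isn’t shown above, here is a stubbed, self-contained version of the same idea that can be run at the REPL (fake-db and the page size of 10 are made up for the demo):

```clojure
;; A stand-in for the real database: pages of up to 10 items, then nil
(def fake-db (vec (partition-all 10 (range 25))))

(defn query-db-table [page]
  (get fake-db page)) ; nil when the page doesn't exist

(defn read-db
  ([] (read-db 0))
  ([page]
   (when-some [results (query-db-table page)]
     (lazy-cat results (read-db (inc page))))))

;; The consumer decides how much actually gets fetched:
(take 12 (read-db)) ; the first 12 items - only two pages are queried
```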

I’ve yet to study Clojure properly, but maybe this simple demonstration will help.

(comment
  ; Glue #1: reduce, a higher-order function
  ; This is sum
  (reduce + 0 [1 2 3 4 5])
  15
  ; This is product
  (reduce * 1 [1 2 3 4 5])
  120

  ; Glue #2: function composition
  (defn doubled  [x] (* 2 x))
  (defn one-more [x] (+ 1 x))
  (def glued (comp doubled one-more))
  (glued 4) ; we add 1 to 4, then double the result
  10

  ; Other example: efficient computation over an infinite list (lazy evaluation)
  ; Advantage: we think in abstract terms
  (take 3 (map doubled (range)))
  (0 2 4)

  :rfc)

As you can see, product and sum represent very different computations (conceptually), but both implementations share a lot of structure. So we can reuse code, and that’s glue #1.

For glue #2, we compose functions; we combine them a bit like Lego bricks: they snap together if compatible. That’s good, again, from a code-reuse perspective.

The last example shows how we can apply a computation to a lazy (infinite) list.

That’s desirable because, if you can represent your program as a series of operations applied to a lazy list, then your program definition could become:

(reduce my-program init future-data)

Which has all sorts of interesting properties.
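One such property (my own illustration, not from the paper): if the program is a reduce over events, then reductions gives you every intermediate state for free, and the event stream can be replayed or inspected:

```clojure
;; State is a map; each event moves the state forward
(defn step [state event]
  (case (:type event)
    :deposit  (update state :balance + (:amount event))
    :withdraw (update state :balance - (:amount event))
    state))

(def events [{:type :deposit  :amount 100}
             {:type :withdraw :amount 30}
             {:type :deposit  :amount 5}])

;; "Run the program" over the data:
(reduce step {:balance 0} events) ; => {:balance 75}

;; Or keep the whole history of states, lazily:
(reductions step {:balance 0} events)
;; => ({:balance 0} {:balance 100} {:balance 70} {:balance 75})
```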

That’s my current understanding of FP, I hope that was on topic and not too basic.

Not to derail you, but if you want to understand that paper, and the notation that goes with it, I can recommend reading the book “Programming in Haskell” by Graham Hutton. The book’s content feels very close to that of the paper, which looks heavily Haskell-influenced.
