José and Julia

José Tafla · February 20, 2026

Image source: ChatGPT

Do you remember the 2009 movie where Amy Adams played Julie Powell and Meryl Streep played Julia Child? It has nothing to do with this article. Instead, I just wanted to see how Julia compares to Python in terms of ease of programming and performance.

Julia

A while ago I wrote this article about functional programming in Python, which I’ve been updating whenever there is something significant to add or I learn something new. In recent weeks I got closer to the Julia Project, which piqued my curiosity about how the language would perform.

Please keep in mind, as I always say, that this is neither a thorough comparison nor a formal benchmark, but just my exploration of a few use cases. Your mileage may vary.

Julia is a dynamic, multi-paradigm, general-purpose programming language that first appeared in 2012, even though work had begun in 2009. Some believe it was named after the famous mathematician Gaston Julia, but according to Jeff Bezanson (one of the original designers) it could mean “Jeff’s uncommon lisp is automated.”

This is the version I’m currently using:

versioninfo()
Julia Version 1.12.4
Commit 01a2eadb047 (2026-01-06 16:56 UTC)
Build Info:
  Official https://julialang.org release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 20 × 13th Gen Intel(R) Core(TM) i9-13900H
  WORD_SIZE: 64
  LLVM: libLLVM-18.1.7 (ORCJIT, alderlake)
  GC: Built with stock GC
Threads: 1 default, 1 interactive, 1 GC (on 20 virtual cores)

Before diving in, a few words of advice.

The timings below may be misleading. In a 64-bit installation (like mine), the default Int is a signed 64-bit integer, so the largest factorial it can hold is 20!; anything bigger overflows. The workaround is to use BigInt instead, and that is what the timings below measure.
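
For instance (a quick sketch; the exact overflow message depends on your Julia version):

factorial(20)         # 2432902008176640000, still fits in Int64
factorial(21)         # throws an OverflowError
factorial(big(21))    # works, returns a BigInt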

An important feature of Julia is multiple dispatch. Depending on the type of values being passed to a function, the language may choose different variants of the function. Let’s use the following example:

multiply(x, y) = x * y
multiply (generic function with 1 method)

Rather straightforward, isn’t it?

multiply(2, 3)
6
typeof(multiply(2, 3))
Int64

When I multiply 2 integers, I get an integer. What if I use a floating point number?

multiply(5, -2.3)
-11.5
typeof(multiply(5, -2.3))
Float64

It does the conversion automatically. What about other data types? Let’s try rational and complex numbers:

multiply(1//2, 5//3)
5//6
typeof(multiply(1//2, 5//3))
Rational{Int64}
multiply(5+3im, 2-6im)
28 - 24im
typeof(multiply(5+3im, 2-6im))
Complex{Int64}

What if I go crazy, or make a mistake?

multiply("Hello,", "world!")
"Hello,world!"
typeof(multiply("Hello,", "world!"))
String

Just like before, Julia picked the right method automatically… and we now know that the * operator, when applied to strings, concatenates them.
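
If you’re curious where that behavior comes from, the @which macro (available in the REPL) shows which method of * gets picked for two strings; I’m omitting the output since it varies between Julia versions:

@which "Hello," * "world!"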

Let’s explore this concept a step further. I declared multiply to be as generic as can be. Now I’ll declare a new function with one method for integers and another for floating point numbers, and then we’ll see how it behaves with other data types.

function tell_me_about(n ::Int64)
    print("Parameter n is of type $(typeof(n)) and its value is $n.\n")
    print("Are you surprised?")
end
tell_me_about (generic function with 1 method)
tell_me_about(12345)
Parameter n is of type Int64 and its value is 12345.
Are you surprised?

A specific response for integers.

function tell_me_about(n ::Float64)
    print("Parameter n is of type $(typeof(n)) and its value is $n.\n")
    print("This shouldn't be a surprise for you.")
end
tell_me_about (generic function with 2 methods)
tell_me_about(1.2)
Parameter n is of type Float64 and its value is 1.2.
This shouldn't be a surprise for you.

A specific response for floating point numbers.

And if you try any other data types, e.g. tell_me_about('x') or tell_me_about(big(0)), you’ll get an error message saying “The function tell_me_about exists, but no method is defined for this combination of argument types.”
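
If you’d rather have a catch-all than an error, you can add one more method with no type annotation. Julia always dispatches to the most specific matching method, so the typed versions above still handle Int64 and Float64. A minimal sketch (this extra method is my addition):

tell_me_about(n) = print("Parameter n is of type $(typeof(n)) and I wasn't expecting it.")
tell_me_about('x')    # now falls back to the generic method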

Going forward, because we want to enforce numerical computations involving BigInt, we’re declaring the argument types explicitly using the :: operator.
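
In the calls below, values are promoted to arbitrary precision with the big() helper, for example:

typeof(big(1558))    # BigInt
big(10)^100          # arbitrary-precision arithmetic, no overflow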

Without further ado, let’s get to the use cases.

Factorial

Julia already has a factorial function, but for this exercise we’ll recreate it.

Starting with the imperative way:

function factorial1(n ::BigInt)
    result = 1
    for i in 2:n
        result *= i
    end
    return result
end
factorial1 (generic function with 1 method)

A BigInt can go as far as the eye can see. Just to make it compatible with the article on Python, I’m using 1558 as input.

@time for i in 1:1000
    factorial1(big(1558))
end
  0.962782 seconds (17.11 M allocations: 1.609 GiB, 33.16% gc time)

In Python, the %%timeit magic runs the code many times and reports the mean time per iteration along with its variation. In Julia, the @time macro reports the time it took to execute the entire expression, so please remember to divide it by the number of loop iterations.
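
If you want per-call statistics closer to %%timeit, the BenchmarkTools.jl package (which I’m not using here) provides the @btime and @benchmark macros; a minimal sketch, assuming the package is installed:

using BenchmarkTools
@btime factorial1($(big(1558)))    # $-interpolation keeps global-variable access out of the measurement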

Whichever way you look at it, it’s much slower.

And that’s alright! Just remember that we’re comparing the performance of a Python int to a Julia BigInt, which is equivalent to comparing apples to oranges. Also, please keep in mind that Julia has an extensive library for numerical computations, including a factorial function (which is much faster than my creation):

@time for i in 1:1000
    factorial(big(1558))
end
  0.008527 seconds (8.00 k allocations: 1.834 MiB)

Onto the recursive way:

factorial2(n ::BigInt) = n <= 1 ? 1 : n * factorial2(n - 1)
factorial2 (generic function with 1 method)
@time for i in 1:1000
    factorial2(big(1558))
end
  0.880698 seconds (17.11 M allocations: 1.609 GiB, 25.46% gc time)

Finally, tail recursion:

function factorial3(n ::BigInt)
    ftr(n ::BigInt, acc ::BigInt) = n == 0 ? acc : ftr(n-1, n*acc)
    return ftr(n, big(1))
end
factorial3 (generic function with 1 method)
@time for i in 1:1000
    factorial3(big(1558))
end
  0.585918 seconds (7.81 M allocations: 1.571 GiB, 17.07% gc time, 1.38% compilation time)

Recursive calculation was faster than the imperative approach: resource allocation was nearly the same, but it demanded less garbage collection time. As for tail recursion, at the time of this writing Julia does not perform tail call optimization, but resource consumption is much lower, which explains the better performance even with the additional compilation overhead.

Fibonacci

Once again, let’s begin the imperative way:

function fibonacci1(n)
    a, b = big(0), big(1)
    for i in 1:n
        a, b = b, a+b
    end
    return a
end
fibonacci1 (generic function with 1 method)

I left the parameter generic; there’s no need to be picky there. But due to the size limit of native integers, the accumulators (and therefore the returned value) have to be BigInt, otherwise I couldn’t go beyond n = 92. How well does it do?

@time for i in 1:1000
    fibonacci1(20577)
end
  3.448405 seconds (41.16 M allocations: 18.030 GiB, 17.78% gc time)

I opted for 20,577 as input just to be compatible with the other article.
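
To see why the big(0), big(1) seeds matter, here is a sketch of the same loop with plain Int64 accumulators (fibonacci1_int is a hypothetical variant, not used elsewhere). Native integer arithmetic wraps around silently, so past n = 92 the result is simply wrong:

function fibonacci1_int(n)
    a, b = 0, 1
    for i in 1:n
        a, b = b, a+b
    end
    return a
end
fibonacci1_int(92)    # 7540113804746346429, still fits in Int64
fibonacci1_int(93)    # a negative number, due to silent wraparound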

Let’s make it recursive:

fibonacci2(n ::BigInt) = n < 2 ? n : fibonacci2(n-1) + fibonacci2(n-2)
fibonacci2 (generic function with 1 method)
@time for i in 1:10
    fibonacci2(big(30))
end
  3.544587 seconds (80.78 M allocations: 2.106 GiB, 29.64% gc time)

Painfully slow.

Onto tail recursion:

function fibonacci3(n)
    ftr(n, a ::BigInt, b ::BigInt) = n == 0 ? a : n == 1 ? b : ftr(n-1, b, a+b)
    return ftr(n, big(0), big(1))
end
fibonacci3 (generic function with 1 method)
@time for i in 1:1000
    fibonacci3(1000)
end
  0.063315 seconds (2.50 M allocations: 94.985 MiB, 11.23% compilation time)

Pretty fast, again not due to tail call optimization, but due to not calculating the same value repeatedly.
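
Another way to avoid the repeated work, without turning the recursion into a loop, is to memoize fibonacci2. A minimal sketch using a Dict as cache (my addition, not part of the original comparison):

const fib_cache = Dict{BigInt,BigInt}()
function fibonacci2_memo(n ::BigInt)
    n < 2 && return n
    get!(fib_cache, n) do
        fibonacci2_memo(n-1) + fibonacci2_memo(n-2)
    end
end

With the cache in place each subproblem is computed only once, which is essentially what the tail-recursive version achieves by threading the two accumulators along.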

Monte Carlo

This was not part of the original article but deserves a visit. Here’s a link to my article on Monte Carlo simulations. The idea is to draw random points in the unit square and estimate π from the fraction that falls inside the quarter circle of radius 1, hence the factor of 4 below.

is_inside(x ::Float64, y ::Float64) = x*x + y*y <= 1.0

function calculate(iterations ::Int)
    inside = 0
    for i in 1:iterations
        if is_inside(rand(), rand())
            inside += 1
        end
    end
    return inside * 4.0 / iterations
end
calculate (generic function with 1 method)
@time calculate(100_000_000)
  0.340710 seconds
3.14174572

One may say this is an unfair comparison: the Julia environment is already loaded and warmed up, so the startup and compilation overhead barely shows up in the timing.

And I agree with you. So what I did next was to save the functions above to a file and run it from the command line. (There is also a mechanism for compiling a Julia program into a standalone executable, but I didn’t try that.)
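
A minimal sketch of that setup, assuming the definitions above are saved to a file I’m calling montecarlo.jl (the name is my choice):

# end of montecarlo.jl, after the is_inside and calculate definitions above
println(@time calculate(100_000_000))

Running it with julia montecarlo.jl then includes Julia’s startup and compilation cost in the wall-clock time.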

And a sample run took a half second.

Yeah, that’s unfair.

