articles tagged with code

Song of GitHub

It’s been a while. Before we catch up, let me sing you the song of my contributions.

Update – wow, I’ve just realised that this embedded iframe will play the tune on every visit to this page (a bit annoying). So I’ve created a quick PR for the project to add a “▸ Click to play” button. If (or when) it gets merged, I’ll throw the embedded music box back in here.

Update – the power of open source: thanks to ajacksified for creating this and merging it!

May 27, 2013 22:00

Suggest subdomain names

Following on with subdomains, say you want one field to auto-suggest into another. For example, when a user enters their company name (or username, last name, etc.), this same value could be used to suggest a valid subdomain. Here’s the JavaScript and HTML (the function uses Prototype for gsub and the element lookup).

The suggestion is based on the rules for a valid subdomain: alphanumeric characters and dashes only, at most 63 characters, and no leading or trailing dash.

<label for="username">Username</label>
<input id="username" onkeyup="suggestSubdomain(value, 'subdomain');" type="text" name="username" /><br/>
<label for="subdomain">Subdomain</label>
<input id="subdomain" type="text" name="subdomain" />

<script type="text/javascript">
  function suggestSubdomain(value, element) {
    var subdomain = value.gsub(/\s+/, '-');
    subdomain = subdomain.gsub(/[^a-z0-9\-]/i, '');
    if(subdomain[0] == '-')
      subdomain = subdomain.substr(1, subdomain.length)
    if(subdomain[subdomain.length-1] == '-')
      subdomain = subdomain.substr(0, subdomain.length-1)

    $(element).value = subdomain.substr(0, 63).toLowerCase();
  }
</script>
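
For example, typing “The Startup Co.” into the username field would suggest “the-startup-co”. If you also want the same suggestion server-side (say, when pre-filling the form), the normalisation is easy to mirror. Here’s a rough Ruby sketch – the suggest_subdomain helper is my own name for it, not part of the listing above:

def suggest_subdomain(value)
  s = value.gsub(/\s+/, '-')        # whitespace runs become dashes (my own helper, for illustration)
  s = s.gsub(/[^a-z0-9\-]/i, '')    # drop anything that isn't alphanumeric or a dash
  s = s.gsub(/\A-+|-+\z/, '')       # trim leading/trailing dashes
  s[0, 63].downcase                 # keep at most 63 characters, lowercased
end

suggest_subdomain("The Startup Co.")  # => "the-startup-co"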
November 07, 2010 21:42

A Lesson In Computational Complexity

What is ‘Computational Complexity’?

Basically, it’s a theory describing the scalability of algorithms: “As the size of the input to an algorithm increases, how do the running time and memory requirements of the algorithm change?” Wikipedia has a more complete description. Complexity can be considered in terms of:

  • Time Complexity – the number of steps that it takes to solve an instance of the problem as a function of the size of the input, using the most efficient algorithm.
  • Space Complexity – the amount of space, or memory, required by the algorithm.

Consider an array with n elements, and suppose that solving a problem on this array takes n times n (or n²) steps. You could say the algorithm used has a complexity of n². Now, with different programming languages there may be additional steps needed to solve the problem, so to describe a general level of complexity across any language, Big Oh notation is used. The time complexity for this algorithm would therefore be written O(n²).

The ‘O’ basically indicates that we are ignoring any language-dependent factors, allowing complexity to be expressed purely in terms of the size of the input.
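
For example, an algorithm that really takes 3n² + 5n + 2 steps is still O(n²): as n grows, the n² term dominates and the constant factors stop mattering.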

An Example

If you take a simple problem like finding the ‘mode’ average of an array – i.e. the most frequently occurring element in a sequence (let’s say of numbers) – you could do this a number of ways. Here is a brute force attempt (in Ruby).

# brute force: for each element, count its occurrences across the whole array
def calculate_mode(nums)
  hi_count = 0
  mode = nums[0]
  nums.each do |search_for|
    count = 0
    nums.each do |num|
      count += 1 if search_for == num
    end
    if count > hi_count
      hi_count = count
      mode = search_for
    end
  end
  mode  # return the most frequently occurring element
end
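
As a quick sanity check (my own example, not from the original post):

calculate_mode([3, 1, 3, 2, 3, 2])  # => 3 (it occurs three times)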

It should be obvious that this algorithm has a complexity of O(n²) – for every element of the array we scan the whole array again to count its occurrences, and the highest count gives the mode.

Could it be done any faster? If the array of elements were sorted (ordered numerically), the mode could be found by looking for the longest run of equal values, which takes only n iterations. We can employ a quicksort algorithm first, which sorts (on average) in O(n log n):

# simple quicksort – O(n log n) on average
def quicksort(a)
  return a if a.size <= 1
  pivot = a[0]
  quicksort(a.select { |i| i < pivot }) +
    a.select { |i| i == pivot } +
    quicksort(a.select { |i| i > pivot })
end
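
For example (again my own check, not from the post): quicksort([5, 3, 8, 1]) returns [1, 3, 5, 8].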

def calculate_mode_fast(nums)
  # sort the array first; the mode is then the longest run of equal values
  sorted_nums = quicksort(nums)

  hi_count = 0
  mode = sorted_nums[0]
  count = 1
  idx = 1

  sorted_nums.each do |search_for|
    if search_for == sorted_nums[idx]
      # still inside a run of equal values
      count += 1
      if count > hi_count
        hi_count = count
        mode = search_for
      end
    else
      # the run has ended, start counting the next value
      count = 1
    end
    idx += 1 if idx < sorted_nums.length
  end
  mode  # return the most frequently occurring element
end

So we could say that this approach has (on average) a complexity of O(n + n log n), which is significantly less than O(n²).
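
To put some rough numbers on that: for n = 1,000 the brute force approach needs on the order of n² = 1,000,000 comparisons, while sorting first needs roughly n log₂ n + n ≈ 10,000 + 1,000 ≈ 11,000 operations – about two orders of magnitude fewer.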

I’ve posted this code listing which runs both algorithms to compute the modal average on identical arrays of random integers. You can vary n (the number of elements) to see how each method performs. From the results (below) it’s clear that the second algorithm is much more time-efficient in this case.
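
If you want to try something similar yourself, a minimal driver might look like the sketch below. This is my own reconstruction, not the posted listing – it simply times both methods on the same random array using Ruby’s Benchmark module:

require 'benchmark'

# my own reconstruction of a driver, not the original listing
n = 1000
nums = Array.new(n) { rand(25) }   # identical random input for both methods

puts Benchmark.measure { calculate_mode(nums) }
puts Benchmark.measure { calculate_mode_fast(nums) }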

calculate_mode
====================
mode => 3 (it occurs 44 times)
array length was 1000
num loops was 1000000
completed in 0.71737

calculate_mode_fast
=========================
mode => 3 (it occurs 44 times)
array length was 1000
num loops was 1040
completed in 0.011247