I'm trying to recursively crawl through all of the links on a page: when a link validates as working, I pull all of the links from that page and add them to the list to be crawled once the current page is finished. However, I think I've hit a problem with how I'm using conj on my sequence of links.
When I run my code, it only processes the initial list of links I feed in when I first call the function.
(defn process-links
  [links]
  (if (not (empty? links))
    (do
      (if (not (is-working (first links)))
        (println (str (first links) " is not working"))
        (conj (get-links (first links)) links))
      (recur (rest links)))))
I'm not quite sure why it's not adding the additional items to the list. Can anyone suggest why it's doing this?
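To make my intent clearer, here is a minimal sketch of the traversal I'm after. The is-working and get-links stubs below are placeholders standing in for my real functions (which hit the network), just so the sketch runs on its own; the idea is to keep an explicit queue and a visited set and thread both through recur, rather than relying on conj changing the sequence in place:

```clojure
;; Stubs standing in for my real is-working / get-links, only so the
;; sketch is self-contained; the real versions would do HTTP checks.
(def site {"a" ["b" "c"], "b" ["c"], "c" []})
(defn is-working [link] (contains? site link))
(defn get-links [link] (get site link []))

;; Thread a work queue and a visited set through loop/recur; newly
;; discovered links are pushed onto the queue for later processing.
(defn crawl [seed]
  (loop [queue (vec seed), visited #{}]
    (if (empty? queue)
      visited
      (let [link  (peek queue)   ; take the most recently queued link
            queue (pop queue)]
        (cond
          (contains? visited link) (recur queue visited)
          (not (is-working link))  (do (println (str link " is not working"))
                                       (recur queue visited))
          :else (recur (into queue (get-links link))
                       (conj visited link)))))))
```

Is something along these lines the right approach, or should the conj in my original code be doing this for me?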