00:20:38 <donri> SourceSpec? why not just use Cabal
16:24:56 <KorriX> hello
16:25:54 <KorriX> i want to use the "Sitemap" type to build "implementation independent" hrefs using Blaze and happstack-web-routes. How do I correctly derive a ToValue instance for the Sitemap?
16:30:16 <donri> can't do that with a plain Html type, you need to thread in the sitemap somehow (e.g. as a function argument, or using RouteT, or a Reader monad...)
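A minimal sketch of the first option donri mentions (threading the URL renderer in as a function argument), assuming blaze-html; the `page` function and the renderer argument are only illustrative:

    {-# LANGUAGE OverloadedStrings #-}
    import           Text.Blaze                  (AttributeValue, (!))
    import           Text.Blaze.Html             (Html)
    import qualified Text.Blaze.Html5            as H
    import qualified Text.Blaze.Html5.Attributes as A

    data Sitemap = PageA | PageB String | PageC String

    -- The template receives the URL renderer as a plain argument, so the
    -- Html code never needs to know how hrefs are actually built.
    page :: (Sitemap -> AttributeValue) -> Html
    page url = H.a ! A.href (url PageA) $ "go to page A"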
16:33:06 <KorriX> can template-haskell derive ToValue ?
16:45:02 <donri> template haskell can only generate code that you could also write out manually
16:47:17 <KorriX> so is there any simple method to get my idea done ?
16:48:05 <donri> depends, i'm not 100% sure what you're after
16:48:19 <KorriX> data Sitemap = PageA | PageB String | PageC String
16:48:19 <KorriX> and in code i want to have:
16:48:19 <KorriX> H.a ! href (toValue PageA)
16:48:23 <donri> you want to be able to write like, a ! href (Post 123)
16:48:26 <donri> =
16:48:28 <donri> ?
16:49:12 <KorriX> equal ?
16:49:15 <donri> typo
16:49:24 <donri> i don't think there's a sane way to do that with *just* the Html type
16:49:54 <donri> myself I have a Reader for an Environment where I put things like a URL building function, the locale etc
16:50:15 <KorriX> can you give me sample code ?
16:50:25 <donri> then I do things like, url <- asks Env.url; a ! href (url PageA)
16:50:52 <donri> https://github.com/dag/kibr/blob/master/src/Text/Kibr/Html.hs
16:51:03 <donri> https://github.com/dag/kibr/blob/master/src/Data/Kibr/Environment.hs
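A rough sketch of the Reader-based approach used in the linked code, assuming a hypothetical Environment record that carries the URL-building function (the field names here are illustrative, not the exact kibr API):

    {-# LANGUAGE OverloadedStrings #-}
    import           Control.Monad.Reader        (Reader, asks)
    import           Text.Blaze                  (AttributeValue, (!))
    import           Text.Blaze.Html             (Html)
    import qualified Text.Blaze.Html5            as H
    import qualified Text.Blaze.Html5.Attributes as A

    data Sitemap = PageA | PageB String | PageC String

    -- Everything the templates need, threaded in once via Reader.
    data Environment = Environment { url :: Sitemap -> AttributeValue }

    page :: Reader Environment Html
    page = do
      render <- asks url
      return $ H.a ! A.href (render PageA) $ "go to page A"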
16:51:14 <KorriX> thank you very much
16:51:40 <donri> also see http://happstack.com/docs/crashcourse/AcidState.html#acid-state-advanced
16:52:18 <donri> that's for serverpart and acid-state but the effect is the same
16:52:40 <donri> also you might consider HSP instead of blaze: http://happstack.com/docs/crashcourse/WebRoutes.html#web-route-hsp
16:53:21 <donri> it allows you to write it like: <a href=PageA>
16:53:37 <KorriX> i like blaze better
16:53:45 <KorriX> it's much faster i think
16:54:03 <donri> dunno
16:54:15 <donri> HSP has other problems, but blaze isn't perfect either
16:54:48 <KorriX> what kind of problems does HSP have ?
16:56:43 <donri> well for example it works via a preprocessor, which makes the haskell source somewhat nonstandard and sometimes conflicts with other similar things, e.g. template haskell (currently you have to use the old [$bla||] syntax for quasi quotes)
16:57:03 <donri> it also will happily allow you to typo an html element
16:57:14 <donri> with blaze you at least get an error for that since elements are functions
16:58:16 <donri> though blaze still allows you to write invalid html
16:58:27 <donri> for example div >> p
16:58:36 <donri> uh, other way around :D
16:58:52 <donri> or for that matter p >> p
16:59:13 <donri> wait, that's not what i mean, i mean: p $ p
16:59:34 <donri> *newly awake*
16:59:49 <KorriX> so which templating engine are you recommending ?
16:59:58 <donri> oh, i don't know :)
17:00:09 <donri> my ideal templating system doesn't currently exist
17:01:04 <donri> there's also xhtml-combinators which is like blaze with added type safety for html validity, but it's for xhtml
17:01:05 <KorriX> which are you using mostly ?
17:01:34 <donri> then there's heist which also ensures valid html since it's also a html parser, but that happens at runtime
17:02:05 <donri> i'm using blaze currently
17:03:56 <KorriX> so if hsp has web-routes integration, there must exist a method to integrate it with blaze ...
17:04:10 <donri> not the way you want it no
17:04:50 <donri> i know from wanting it too :(
17:05:25 <KorriX> so i will have static hrefs
17:05:42 <donri> ?
17:05:47 <KorriX> strings
17:06:04 <donri> no, you can do it like i do in the code i linked
17:06:16 <KorriX> too much work for me
17:06:20 <donri> :)
17:12:51 <KorriX> why, when you derivePathInfo via TH, does it map the url starting with a capital letter (localhost/Page instead of localhost/page) ?
17:14:17 <donri> yep
17:14:46 <donri> use boomerang if you want to customize the URLs
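A rough sketch of what a boomerang router could look like for a type like the Sitemap above, following the web-routes-boomerang style from the happstack crash course; the TH and combinator names (makeBoomerangs, lit, anyText) differ between versions, so treat this as an approximation rather than exact API:

    {-# LANGUAGE OverloadedStrings, TemplateHaskell, TypeOperators #-}
    import Data.Text (Text)
    import Web.Routes.Boomerang

    data Sitemap = PageA | PageB Text | PageC Text

    -- Generates rPageA, rPageB, rPageC.
    $(makeBoomerangs ''Sitemap)

    -- Lower-case URLs, written once; both parsing and printing fall out of this.
    sitemap :: Router () (Sitemap :- ())
    sitemap =
           rPageA . lit "page-a"
        <> rPageB . (lit "page-b" </> anyText)
        <> rPageC . (lit "page-c" </> anyText)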
17:18:45 <KorriX> thanks
17:24:59 <KorriX> hmm ... it doesn't make sense
17:25:05 <KorriX> boomerang ...
17:25:43 <KorriX> is there any simpler method to change this ?
17:28:34 <donri> i suppose you can write a PathInfo instance yourself
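A hand-written instance along those lines, assuming web-routes' PathInfo class with toPathSegments/fromPathSegments plus the segment and anySegment parsers (the exact types moved around between web-routes versions, so this is only a sketch):

    {-# LANGUAGE OverloadedStrings #-}
    import Control.Applicative ((<$>), (<$), (<|>), (*>))
    import Data.Text (pack, unpack)
    import Web.Routes.PathInfo (PathInfo(..), segment, anySegment)

    data Sitemap = PageA | PageB String | PageC String

    -- Lower-case URLs instead of the capitalised ones derivePathInfo produces.
    instance PathInfo Sitemap where
      toPathSegments PageA     = ["pagea"]
      toPathSegments (PageB s) = ["pageb", pack s]
      toPathSegments (PageC s) = ["pagec", pack s]
      fromPathSegments =
            PageA <$  segment "pagea"
        <|> PageB <$> (segment "pageb" *> (unpack <$> anySegment))
        <|> PageC <$> (segment "pagec" *> (unpack <$> anySegment))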
17:29:24 <KorriX> i am lazy, and i don't want to write a big 20-line case ... of
17:35:28 <stepcut> it would be difficult to be more concise and expressive than boomerang..
17:36:21 <KorriX> can boomerang do it without case ... of ?
17:40:25 <KorriX> is there any method to map over all the value constructors of a type ?
17:42:02 <stepcut> you want to apply a general, rule based transformation to all the constructors to create the showing and rendering functions?
17:43:19 <stepcut> using generics or something?
17:43:33 <KorriX> generics ?
17:43:38 <KorriX> what it is ?
17:47:38 <stepcut> there are a bunch of libraries such as SYB, syb-with-class, uniplate, regular, etc, which allow you to write functions that can work over arbitrary data types. In general, the functions can inspect the structure and the names of the constructors of the data types and then perform actions on them
17:49:28 <stepcut> so you could write a function that says, if I have a Constructor like, Name arg1 arg2, I want to render it like, map toLower Name ++ "/" ++ showURL arg1 ++ showURL arg2
17:49:59 <stepcut> and then that would work for any type/constructor
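A tiny illustration of the generic flavour stepcut describes, using Data.Data (SYB): the constructor name can be pulled out and lower-cased without writing any case expression (the constructor arguments would need a similar generic treatment, e.g. via gmapQ):

    {-# LANGUAGE DeriveDataTypeable #-}
    import Data.Char (toLower)
    import Data.Data (Data, Typeable, toConstr, showConstr)

    data Sitemap = PageA | PageB String | PageC String
      deriving (Typeable, Data)

    -- First URL segment derived from the constructor name: PageB "x" -> "pageb"
    constructorSegment :: Data a => a -> String
    constructorSegment = map toLower . showConstr . toConstr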
17:50:31 <stepcut> I gotta run
17:58:28 <Lemmih> Still here, stepcut?
19:01:57 <KorriX> is there any tutorial about data generics ?
19:04:57 <alpounet> it depends on the library you wanna use
19:05:55 <KorriX> Data.Generics
19:09:50 <alpounet> so, syb?
19:10:23 <alpounet> have you read the papers behind the design of that library?
19:37:11 <stepcut> Lemmih: I am around.. but about to make some mead
20:23:41 <Lemmih> stepcut: I can't seem to reproduce migration errors when going from 0.5 to 0.6.
20:24:15 <Lemmih> stepcut: If you make test cases, I'll fix 'em.
20:27:54 <stepcut> k
20:28:38 <Lemmih> The log files are versioned so I should be able to write good solutions.
20:38:34 <KorriX> hello 2nd time
20:39:11 <KorriX> can webroutes-boomerang apply a routing method to all of the routers ?
20:39:16 <KorriX> rSomething
22:16:05 <alpounet> aaaalright
22:16:10 <alpounet> let's get back to scoutess, shall we
22:42:10 <alpounet> stepcut, do you have some function to download a file over http or should i write one?
22:51:23 <stepcut> alpounet: one moment
22:53:07 <stepcut> http-conduit is probably ok for now
22:53:09 <stepcut> http://hackage.haskell.org/packages/archive/http-conduit/1.2.4/doc/html/Network-HTTP-Conduit.html
22:53:21 <stepcut> would rather use, http-pipes, but no one has written that yet
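For reference, a download with http-conduit is roughly one call to simpleHttp (the exact signature varies a bit across http-conduit versions):

    import qualified Data.ByteString.Lazy as L
    import Network.HTTP.Conduit (simpleHttp)

    -- Fetch a URL and dump the body to disk, e.g. a package tarball from hackage.
    fetchUrl :: String -> FilePath -> IO ()
    fetchUrl url dest = simpleHttp url >>= L.writeFile dest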
22:55:18 <alpounet> so we should add this as a dependency?
22:55:36 <alpounet> that's for fetching package archives from hackage
22:57:35 <alpounet> that also makes us add a whole bunch of dependencies (http-conduit's)
22:58:04 <stepcut> yeah.. that is why I am not thrilled
22:58:16 <stepcut> if you want to do something else, that is fine too
22:58:24 <stepcut> especially if something else == http + pipes ;)
22:58:35 <alpounet> you're the 'pipes' guy
22:58:39 <alpounet> can't steal your role :p
22:59:14 <stepcut> :)
22:59:34 <alpounet> i was thinking about using Network.HTTP or curl, or download, or download-curl
23:01:34 <alpounet> the download and download-curl packages are the ones that i would prefer, because of their higher-level interface
23:01:42 <alpounet> but they bring unnecessary dependencies in ...
23:08:07 <alpounet> i would have liked a kind of "standalone" code
23:14:34 <donri> meh, dependencies are OK, just make sure to run scoutess on itself ;)
23:16:45 <alpounet> i don't like the idea of depending on tagsoup and feed for... downloading a file over HTTP
23:17:04 <alpounet> there's http://hackage.haskell.org/package/http-wget though that just wraps wget
23:18:20 <donri> just write your own wrapper function and use whatever for now?
23:18:20 <stepcut> sounds good
23:18:42 <stepcut> we would definitely wrap it up so that we can switch it out later
23:20:16 <donri> personally i'm not really rooting for pipes as long as it isn't solving all the issues. got any real complaint about conduits?
23:20:57 <donri> seems to me that conduit is evolving to become more elegant with every new release
23:22:36 <alpounet> i'll just go with Network.HTTP for now i guess
23:22:56 <alpounet> we'll see if we need more later
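A thin wrapper in the spirit of "wrap it up so we can switch it out later", using Network.HTTP for now; the name downloadFile is just a placeholder, and the String-based API is not ideal for binary tarballs (the ByteString request type would be safer):

    import Network.HTTP (simpleHTTP, getRequest, getResponseBody)

    -- Single choke point for HTTP, so the backend (Network.HTTP today,
    -- http-conduit or a pipes-based client later) can be swapped in one place.
    downloadFile :: String -> FilePath -> IO ()
    downloadFile url dest = do
      body <- simpleHTTP (getRequest url) >>= getResponseBody
      writeFile dest body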
23:26:07 <stepcut> donri: well, they did start by using IORefs.. I guess there is nowhere to go but up :)
23:27:17 <donri> ^_^
23:27:44 <alpounet> stepcut, what's that srcCacheDir field in SourceConfig ?
23:27:51 <donri> it's the only package solving the issue of scarce resources like file descriptors
23:28:04 <alpounet> in Scoutess.Service.Source.Core
23:28:06 <stepcut> donri: pipes 1.1 have stuff for that as well
23:28:10 <donri> ah
23:28:19 <donri> current pipes is *really* crappy for resource management
23:28:38 <stepcut> donri: indeed
23:28:40 * donri wishes the pipes author would release more often
23:28:48 <donri> he wants to avoid "spamming hackage" :P
23:29:46 <stepcut> alpounet: just a directory that holds source we have already downloaded. If that directory gets wiped, we can always recreate it. But saving it between runs lets us do incremental updates
23:29:48 <alpounet> stepcut, does it designate the directory we want to put the retrieved source IN ?
23:29:53 <stepcut> yes
23:30:09 <donri> but. complaining about *past* usage of IORef is like avoiding acid-state because happs[tack]-state used a global state handle. just sayin' :)
23:31:06 <alpounet> stepcut, is it specific to one particular source fetching (like "put the source of pipes-1.1 in that directory") or smth else? i don't quite get the scope of this
23:32:15 <stepcut> yesod: if in doubt, use an IORef :P
23:32:18 <stepcut> alpounet: not sure
23:32:36 <donri> oh, is that a general trend in yesodland?
23:32:41 <stepcut> alpounet: I imagine it would be a directory that looks like, ~/src/darcs/pipes-1.1
23:32:47 <donri> i haven't studied their code that closely
23:32:52 <alpounet> well, would be useful for me to know, i'm about to add Hackage and Uri fetching with Network.HTTP
23:33:15 <stepcut> donri: ResourceT uses IORefs as well, and there are some IORefs in warp last I looked (but they seemed related to conduits so maybe that has changed)
23:33:47 <stepcut> so srcCacheDir would be ~/src (which is actually a terrible location now that I think about it)
23:33:58 <stepcut> and then each fetcher would have its own sub directory
23:34:10 <stepcut> and then in the sub-directory there are more subdirectories for each unique source that is fetched
23:34:35 <alpounet> yeah ok
23:34:58 <alpounet> so
23:35:17 <alpounet> say we wanna fetch smth from hackage
23:35:18 <stepcut> "yesod: any problem can be solved given enough IORefs and QQ"
23:35:24 <alpounet> where should it be downloaded to?
23:35:47 <stepcut> I would say.. $srcCacheDir/hackage/package-version
23:36:09 <alpounet> ~/src/hackage/<package>-<version>.tar.gz ? ~/src/hackage/<package>-<version>/<package>-<version>.tar.gz ?
23:36:13 <stepcut> we know that will be unique, since hackage ensures that package-version is always unique
23:36:29 <stepcut> hmm
23:36:36 <alpounet> and even another solution
23:36:42 <stepcut> maybe, ~/src/hackage/package/package-version.tar.gz ?
23:36:54 <alpounet> ~/src/hackage/<package>/<version>/<package>-<version>.tar.gz
23:37:08 <alpounet> one thing we have to keep in mind is that when extracting the .tar.gz,
23:37:09 <stepcut> similar to what ~/.cabal/packages looks like ?
23:37:29 <alpounet> we have a directory tree like package/version/ in which you have the cabal file etc
23:37:42 <alpounet> that's how hackage stores them
23:37:52 <stepcut> yeah
23:37:57 <alpounet> and how the index-generating code (cabal's, and ours too) assumes the directory tree is laid out, etc
23:38:07 <stepcut> yeah
23:39:48 <alpounet> so to kinda stick to the hackage way, we could just store all the .tar.gz in $srcCacheDir/hackage/
23:40:00 <alpounet> BUT, that would mean no consistency at all with the rest of the code
23:40:44 <alpounet> and that would only be an advantage for the index generation code if we perform the index generation on THESE copies of the .tar.gz, not on ones that we would have moved into a specific directory created for a build process
23:40:50 <alpounet> not sure i'm being clear here...
23:41:56 <stepcut> I think it is ok for each source fetcher to use its own scheme
23:42:06 <stepcut> because they have different needs
23:42:16 <stepcut> when fetching from darcs, it does not make sense to have versioned directory names
23:42:49 <stepcut> when we generate the index, we are going to first call, cabal sdist, in all the unpacked source directories, and then copy over the .tar.gz files that get generated I think
23:43:23 <stepcut> and the list of unpacked source directories will be something that is explicitly passed to us.. so we don't need to be able to calculate it in a generic way
23:44:02 <stepcut> We are going to call something like, getSource :: [SourceLocation] -> IO [InformationAboutUnpackedSourceDirectories]
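To make the layout being discussed concrete, a hypothetical helper for the hackage fetcher, following the ~/src/hackage/<package>/<package>-<version>.tar.gz scheme stepcut suggested (nothing here is committed scoutess API):

    import System.FilePath ((</>), (<.>))

    -- e.g. hackageTarballPath "/home/alp/src" "pipes" "1.1"
    --        == "/home/alp/src/hackage/pipes/pipes-1.1.tar.gz"
    hackageTarballPath :: FilePath -> String -> String -> FilePath
    hackageTarballPath srcCacheDir pkg ver =
      srcCacheDir </> "hackage" </> pkg </> (pkg ++ "-" ++ ver) <.> "tar" <.> "gz"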
23:45:15 <alpounet> yeah alright
23:45:38 <donri> stepcut: do you know if pipes 1.1 also finalizes resources as *early as possible*, e.g. when composed?
23:46:57 <alpounet> stepcut, so the Hackage fetcher just has to download stuff to $srcCacheDir/hackage/ ? or should it unpack them too ?
23:47:13 <stepcut> donri: that is the claim
23:47:51 <stepcut> https://plus.google.com/u/0/116312471061608346570/posts/Fan596nt5at
23:48:57 <stepcut> alpounet: it should unpack it as well
23:49:43 <stepcut> alpounet: if you are using a quilt target, (for example), then it is going to want to copy that directory and apply additional quilt patches to it
23:54:23 <alpounet> ok
23:54:53 <alpounet> i discovered there was smth called quilt a few minutes ago by reading the code, so i'm not really close to writing down that fetching :p
23:55:33 <stepcut> :)
23:55:34 <stepcut> no worries
23:58:12 <alpounet> stepcut, will we use pipes 1.1 when released, in scoutess ?
23:58:28 <alpounet> i guess that's what your Control.Pipe.Process code was about