Extending Spyce with spyceModule

One of the ways to extend Spyce is with a "Spyce module." This is a historical term, and a little unfortunate, because some people assume that when I talk about "modules" I mean the Spyce variety rather than the vanilla Python variety.

A Spyce module is simply a class that extends spyceModule.spyceModule. That's it.

Spyce modules may be used in a .spy page with

[[.import name="modulename"]]

which instructs the compiler to generate code that creates an instance of the given spyceModule at the beginning of each request for this page, and automatically invokes that instance's finish method when the request finishes.
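
Here's a minimal sketch of what such a module looks like -- the timing module and its behavior are my own illustration, not one of Spyce's standard modules:

import time
from spyceModule import spyceModule

class timing(spyceModule):
    def start(self):
        # called once at the beginning of each request
        self._begin = time.time()

    def finish(self, err):
        # called when the request finishes; err is the exception that
        # terminated the request, or None if it finished normally
        print 'request took %.3fs' % (time.time() - self._begin)

A page would pull this in with [[.import name="timing"]], assuming timing.py is where Spyce can find it.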

What can you do with Spyce modules? A common reason to write one -- perhaps the most common -- is to provide some sort of resource pooling. Here's the spydurus module, which does exactly that:

spydurus.py

CONNECTIONS = 3 # max connections to put in the pool

import Queue
from spyceModule import spyceModule

from durus.client_storage import ClientStorage
from durus.connection import Connection

# module-level pool, shared by every request handled by this process
q = Queue.Queue()
for i in range(CONNECTIONS):
    q.put(Connection(ClientStorage()))

class spydurus(spyceModule):
    def start(self):
        self.conn = None
        try:
            self.conn = q.get(timeout=10)
        except Queue.Empty:
            raise RuntimeError('timeout while getting durus connection')
        else:
            self.conn.abort() # syncs connection

    def finish(self, err):
        q.put(self.conn)

That's it! (Well, minus some convenience methods dealing with self.conn that are elided for concision here.) Note that finish is always called, even if your code raises an exception, even if you redirect to another page, even if another module's finish method raises.
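
spydurus puts the connection back unconditionally, but since finish receives the error (if any), a module can react to it. A hypothetical variant that discards uncommitted changes when the request failed might look like:

    def finish(self, err):
        # err is None on success, or the exception that ended the request
        if err is not None:
            self.conn.abort() # throw away any uncommitted changes
        q.put(self.conn)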

Most Spyce modules are also useful as plain Python modules -- for instance, tododata.py in the Spyce to-do demo uses spydurus.q to get one of the pooled connections in a non-.spy context.
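
For instance, code along these lines (the get_items function and the 'items' root key are made up for illustration) borrows and returns a pooled connection outside of any Spyce request:

import spydurus

def get_items():
    conn = spydurus.q.get(timeout=10) # borrow a pooled connection
    try:
        conn.abort() # sync to the latest committed state
        return conn.get_root()['items']
    finally:
        spydurus.q.put(conn) # always return it to the pool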

spydurus doesn't need it, but spyceModule also provides a hook into the spyce server in the form of self._api. You can do just about anything with this. I've seen a module that implements cgi_buffer functionality, a module that provides a Struts-style controller, and more. And of course there are all the standard modules.

(Updated links to point to the sf.net site.)
