MySQL Proxy ships with a tokenizer: a function that, given a query, returns its components as an array of tokens. Each token contains three elements:
- token_name, which is a human-readable name of the token (e.g. TK_SQL_SELECT)
- token_id, which is the numeric identifier of the token (e.g. 204)
- text, which is the content of the token (e.g. "select").
SELECT 1 FROM dual will be returned as the following tokens:
- TK_SQL_SELECT, with text "SELECT"
- TK_INTEGER, with text "1"
- TK_SQL_FROM, with text "FROM"
- TK_LITERAL, with text "dual"
It's an array of 4 elements, each one containing the three items described above. (The numeric token ids depend on the MySQL Proxy version, so they are omitted here.)
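As a minimal sketch of how to inspect this array (assuming the standard MySQL Proxy Lua scripting environment, where a read_query hook receives the raw packet; the script name is hypothetical), you could dump every token of an incoming query:

```lua
-- hypothetical dump-tokens.lua, loaded with: mysql-proxy --proxy-lua-script=dump-tokens.lua
function read_query(packet)
    -- only COM_QUERY packets carry a query text
    if packet:byte() ~= proxy.COM_QUERY then return end
    -- strip the command byte, then tokenize the query string
    local tokens = proxy.tokenize(packet:sub(2))
    for i, token in ipairs(tokens) do
        print(i .. ": " .. token.token_name
              .. " (id " .. token.token_id .. ") -> " .. token.text)
    end
end
```

Running SELECT 1 FROM dual through the proxy with this script would print the four tokens listed above, one per line.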
Armed with this new knowledge, we can now try to catch UPDATE queries using the tokenizer.
The tokenizer can do more than this, and there are performance implications to keep in mind. Tokenizing a short query can take about three times as long as matching it with a regular expression, and with long queries the gap widens: tokenizing can cost up to ten times as much. The tutorial mentioned below will deal with this issue as well.
local tokens = proxy.tokenize(query)
-- tokens is an array: in "UPDATE tbl SET ...", tokens[1] is the verb
-- and tokens[2] is the table name
if tokens[1].token_name == 'TK_SQL_UPDATE' then
    print('this is an update of table ' .. tokens[2].text)
end
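One way to limit the tokenizing cost mentioned above (a sketch of mine, not taken from the tutorial, assuming the MySQL Proxy Lua environment) is to run a cheap regular-expression check first and only tokenize the queries that might match:

```lua
function read_query(packet)
    if packet:byte() ~= proxy.COM_QUERY then return end
    local query = packet:sub(2)
    -- cheap pre-filter: skip the tokenizer unless the query could be an UPDATE
    if query:match("^%s*[Uu][Pp][Dd][Aa][Tt][Ee]") then
        -- confirm with the tokenizer, which is reliable but slower
        local tokens = proxy.tokenize(query)
        if tokens[1].token_name == 'TK_SQL_UPDATE' then
            print('this is an update of table ' .. tokens[2].text)
        end
    end
end
```

This way the expensive call to proxy.tokenize only happens for the small fraction of queries that start with the UPDATE keyword, while the final decision still rests on the tokenizer rather than on a fragile pattern.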
This post is part of a set of recipes that will eventually become a long article.
Want to hear more? Attend MySQL Proxy - The complete tutorial at the MySQL Users Conference 2008.