I wrote a post about setting up Gitweb on Mac OS X two years ago. Today I was reading it again to set up gitweb on another Mac and noticed that the steps have slightly changed.
The steps below assume that Git is installed in /usr/bin (otherwise, modify the bindir property accordingly); you only need to change GITWEB_PROJECTROOT to point to the root of your Git projects.
To install gitweb (from Git 1.7.2.1) on Mac OS X 10.6.4, the steps are now:
# define the projects root
export GITWEB_PROJECTROOT="/Users/jmesnil/Git/"
# retrieve the latest version of git
git clone git://git.kernel.org/pub/scm/git/git.git
cd git/
# install gitweb.cgi
sudo make GITWEB_PROJECTROOT=$GITWEB_PROJECTROOT \
GITWEB_CSS="/gitweb/gitweb.css" \
GITWEB_LOGO="/gitweb/git-logo.png" \
GITWEB_FAVICON="/gitweb/git-favicon.png" \
GITWEB_JS="/gitweb/gitweb.js" \
bindir=/usr/bin \
gitwebdir=/Library/WebServer/CGI-Executables/gitweb \
install-gitweb
# move the static files to the proper folder
mkdir -p /Library/WebServer/Documents/gitweb
cd /Library/WebServer/
mv CGI-Executables/gitweb/static/* Documents/gitweb
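Nothing above touches the Apache configuration: it relies on Mac OS X's stock setup, where /Library/WebServer/CGI-Executables is already served as /cgi-bin/. If gitweb does not show up, check /etc/apache2/httpd.conf for a mapping along these lines (quoted approximately, not verbatim from Apple's file):

```apache
# Stock Mac OS X mapping (approximate; check /etc/apache2/httpd.conf):
ScriptAlias /cgi-bin/ "/Library/WebServer/CGI-Executables/"
# With it, gitweb is reachable at http://localhost/cgi-bin/gitweb/gitweb.cgi
# and the static files moved to Documents/gitweb are served from /gitweb/
```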
I intend to change the way I write on this weblog a bit, to link more often to other URLs of interest (about code, photos, etc.). Based on the DF-Style Linked List plug-in, I now distinguish between my own posts and simpler link posts.
This is similar to Daring Fireball or James Duncan Davidson's blog. I find this more convenient: when reading from a news reader, clicking on the title of a link post will directly open the linked web site instead of going through mine for a useless round trip. Link posts are identified by appending a ☛ to the post title (in both the feed and the web pages).
For longer, more personal posts, I differentiate them by prepending a ⚑ in the feed title.
This way, I clearly distinguish between my own writings (⚑) and others' writings that I link to (☛).
If you read this using a feed reader, you may not have noticed, but I also updated the look of my web site. I streamlined it by using the simplest WordPress theme I could find and switching to the Helvetica Neue font. The simpler, the better...
I don't really understand what this score means or how it is measured, but apparently it confirms that the D7000 is a good camera. I will not take better pictures because of it, but I am no longer able to blame the camera for the photographer's deficiencies! :)
This weekend I upgraded my camera and bought the new Nikon D7000 body. I did not play much with it but it looks like a winner.
At the moment, Aperture 3 does not yet support this camera. While I wait for an update so that I can process RAW pictures, I am shooting in RAW + JPEG. This way, I can already see the JPEGs, and when the update comes, I will be able to work on the RAW files.
However, it is also an opportunity to change my workflow a bit. Until now, I shot everything in RAW, but I wonder if this is not overkill. Of a whole batch of pictures, I delete most of them after importing into Aperture, keep only a few, and post-process even fewer.
Most of my pictures could be stored as JPEG instead of unprocessed RAW without any quality loss (I like the look of the JPEGs straight out of the D7000). The D7000 has a 16Mpix sensor, and shooting RAW consumes a lot of disk space (19.6MB for a RAW against 5MB for the corresponding fine JPEG).
Ideally, I'd like to shoot using RAW + JPEG, keep the JPEG for most of the photos and use RAW only for those that are worth it or need post-processing.
Using Aperture 3, I am settling on this workflow:
Import both RAW + JPEG as Separate Masters
Auto-Stack (⌥⌘A) them with a short time span (e.g. 0:02) to keep them together
Browse all the pictures and reject either the RAW (if the JPEG is good enough) or the JPEG (to keep the RAW for further post-processing)
Aperture 3 can also import RAW + JPEG as a single resource, but there is no simple way to keep only one of them afterwards (without going through a whole new export/import sequence). Using stacks to group a JPEG and a RAW is a workaround. It does not work when shooting fast sequences, but that is a non-issue: in that case, I shoot JPEG only to maximize the frame rate.
As long as Aperture 3 does not support the D7000, I will keep both the RAW and JPEG pictures and revisit them when it is updated.
I am not completely sold on my workflow: consuming less disk space does not seem like a good reason to change it. I could just as well increase the size of my MacBook's HDD, but I am also pondering a switch to an SSD, which would likely be smaller than my current HDD. On the other hand, it feels overkill to keep everything in RAW when I process only a tiny fraction of my pictures and am happy with the quality of the D7000's JPEGs.
It will be interesting to reconsider my workflow after some time and see what ratio of RAW to JPEG I end up keeping.
Pearl Jam isn't the first veteran rock band to see a decrease in fans as it got older. But it's the best example of a band deliberately expediting the process. Pearl Jam helped to set a template that all too many alt-rock bands would follow in the '90s: success, and then retreat. Make people love you, and then disengage. Get to a certain level, and just stop.
Pearl Jam feels like a band that enjoys playing together and having fun with its fans rather than seeking media attention. I went to hear them in Paris on a September 11th a few years ago (for their Avocado album) and it was the best rock concert I have ever been to.
For an iPhone application I am writing in my spare time, I have added a MKMapView which displays a small map with a fixed coordinate. The MKMapView does not allow user interaction; it is for information purposes only, to give some geolocation context. However, I want the user to be able to tap the map view to open it full screen and interact with it (zoom, scroll or open it in Maps). The idea is very similar to geolocated tweets in Twitter for iPhone.
The issue I encountered is that MKMapView captures user events and there is no way to retrieve them (subclassing MKMapView is discouraged). I searched the internet for a way to know when the user taps the MKMapView, but none of the solutions were simple enough (or elegant enough) for my liking.
I ended up solving it using a UIView and a block.
The basis of the solution is to add a UIView on top of the MKMapView to handle the touchesBegan:withEvent: method when the user taps on the map view (or so they think...). When this UIView's touchesBegan:withEvent: method is called, it calls a block passed in by my main UIViewController. This keeps all the logic inside the main view controller.
This solution requires only a few lines of code and no changes to the NIB file.
In the NIB file, I just placed a MKMapView (with zoom and scroll disabled) that is connected to its controller through an IBOutlet. I also added a mapTouchView which is not an IBOutlet; it is created and configured directly in code.
The MapTouchView's initialization method takes a block that will be called whenever its touchesBegan:withEvent: method is invoked. When that happens here, MyController calls its own displayFullMap method.
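Since the original listings are not reproduced here, this is a sketch of what MapTouchView's interface might look like (the TouchesBlock typedef name is my own; the post only specifies that the initializer takes a block):

```objectivec
// MapTouchView.h -- a transparent view that reports taps through a block.
#import <UIKit/UIKit.h>

// Block type matching the signature of touchesBegan:withEvent: (assumed).
typedef void (^TouchesBlock)(NSSet *touches, UIEvent *event);

@interface MapTouchView : UIView {
    TouchesBlock touchesBlock;
}

// The block is called every time touchesBegan:withEvent: fires.
- (id)initWithFrame:(CGRect)frame touchesBlock:(TouchesBlock)block;

@end
```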
In MyController's viewDidLoad, we make sure to insert the mapTouchView above the mapView:
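A sketch of that viewDidLoad, assuming a MapTouchView initializer that takes a frame and a block (the original listing is missing, so the exact method names are my reconstruction):

```objectivec
// MyController.m (sketch, pre-ARC)
- (void)viewDidLoad {
    [super viewDidLoad];
    // Create the overlay with the same frame as the map view and hand it
    // a block that simply forwards the tap to this controller.
    mapTouchView = [[MapTouchView alloc] initWithFrame:mapView.frame
                                          touchesBlock:^(NSSet *touches, UIEvent *event) {
                                              [self displayFullMap];
                                          }];
    // Insert it above the map view so it receives the touches first.
    [self.view insertSubview:mapTouchView aboveSubview:mapView];
}
```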
With these few lines, we have displayed an invisible UIView on top of the MKMapView. To verify this, you can change the background color of mapTouchView in viewDidLoad:
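For example (a temporary debugging line, not part of the final code):

```objectivec
// Tint the overlay so its position and size become visible over the map.
mapTouchView.backgroundColor = [UIColor colorWithRed:1.0 green:0.0 blue:0.0 alpha:0.3];
```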
We keep a block attribute which is set in the initialization method and will be called when the user touches the view. The implementation is straightforward too:
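A sketch of that implementation (again assuming a TouchesBlock typedef for the block's type, since the original listing is missing; memory management is pre-ARC, as it was in 2010):

```objectivec
// MapTouchView.m (sketch, pre-ARC)
#import "MapTouchView.h"

@implementation MapTouchView

- (id)initWithFrame:(CGRect)frame touchesBlock:(TouchesBlock)block {
    if ((self = [super initWithFrame:frame])) {
        // Copy the block so it survives the caller's stack frame.
        touchesBlock = [block copy];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Forward the event to whoever provided the block.
    touchesBlock(touches, event);
}

- (void)dealloc {
    [touchesBlock release];
    [super dealloc];
}

@end
```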
When the initialization method is called, we copy the block passed as an argument. When the touchesBegan:withEvent: method is called, we call the block, passing along the method's arguments.
In my controller's case, I don't need any of these arguments, but I preferred to have the block's arguments match the event method's arguments so that MapTouchView can be reused in other contexts which require them (to know the tap count, for example).
This solution is simple (and reusable): all the logic remains in MyController and MapTouchView does not contain any code specific to my application. It would be possible to make MapTouchView even more generic (e.g. have the block return a BOOL to determine whether the event must be passed on to the map view, implement other event methods, etc.), but it solves my problem as it is.
It was also an opportunity to use blocks on iOS; they are a great new tool that I have added to my toolbox to write simpler, more elegant code.
In my previous post, I complained that the JMX API was not consistent about the return types of DynamicMBean's getAttribute() and getAttributes() methods.
Eamonn McManus replied with a very good explanation of why getAttributes() is designed that way: to be consistent with setAttributes(). This bit of "ugliness" in the API is a very good compromise. It requires some attention from the API user, but it helps developers providing MBeans tremendously, as they don't have to deal with the atomicity of setting different attribute values. I would still object that getAttribute() should have been consistent and returned an Attribute instead of the value, but I can understand that the Attribute indirection has no purpose in that case.
I like the JMX API for that reason: it's both simple and flexible. There are some parts which are ugly or require more work than I would like (writing an OpenMBean is a chore, with too much information that should be handled by the JVM or the JMX library), but I understand that JMX is widely used and backwards compatibility is a mandatory requirement. If every new release of Java came with a refined JMX API without regard for backwards compatibility, I would not praise it like I do.
Writing an API that is both simple and empowering is hard. Supporting it over several releases is a tremendous task. In the Java world, Eclipse is the front runner on that task.
Several years ago, I developed a plug-in, eclipse-jmx, to manage JMX applications inside Eclipse (it is also integrated into JBoss Tools as its JMX tool). Release after release, it is still supported by Eclipse without any changes to its API calls.
The last release of eclipse-jmx was written against Eclipse 3.2, and three years and four Eclipse releases later, it still works fine; you can even install it using the Eclipse Marketplace. I have a love/hate relationship with Eclipse, which I use every day, but I have only admiration for its developers, who have been able to craft such a piece of software and took it upon themselves to support its APIs over a long time and multiple versions instead of pushing all the work onto contributors to constantly update their plug-ins for each new release. The beauty of the Eclipse APIs is in their durability and resiliency.
If you want to know how to write great Java APIs, the best resources to start with are Eclipse and JMX.
On a similar topic, I am having a deep look at HTML5 and its various JavaScript APIs (WebSockets, geolocation, storage, canvas, etc.). I am curious to see how they will evolve and whether lessons will have been learnt from the evolution (or lack thereof) of the DOM API...
jmx4r 0.1.2 has just been released (jmx4r is a JRuby library which makes it super easy to write simple Ruby scripts to manage Java & JRuby applications using JMX).
It fixes a bug in the getAttributes() method that is exposed by Ruby objects extending jmx4r's DynamicMBean class.
I introduced the bug but it is one of the rare cases where I would still blame JMX API. Its DynamicMBean interface has two methods:
getAttribute(String attribute)
getAttributes(String[] attributes)
Following the principle of least surprise, I expected getAttribute(String) to return the value of the attribute and getAttributes(String[]) to return a collection of the attribute values. But that's not the case. As expected, getAttribute(String) returns the value of the attribute, but getAttributes(String[]) returns an AttributeList, which is a list of Attribute objects (which in turn contain the name and value of an attribute). jmx4r 0.1.2 fixes the issue and makes sure the methods return the correct objects.
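To make the asymmetry concrete, here is a minimal Java illustration of what the two return types look like (the "Count" attribute and the value 42 are made up for the example):

```java
import javax.management.Attribute;
import javax.management.AttributeList;

public class AttributeListDemo {
    public static void main(String[] args) {
        // getAttribute("Count") returns the plain value:
        Object value = Integer.valueOf(42);
        System.out.println(value);               // 42

        // getAttributes(new String[]{"Count"}) returns an AttributeList,
        // i.e. a list of (name, value) Attribute pairs, not a list of values:
        AttributeList list = new AttributeList();
        list.add(new Attribute("Count", 42));

        Attribute first = (Attribute) list.get(0);
        System.out.println(first.getName());     // Count
        System.out.println(first.getValue());    // 42
    }
}
```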
JMX is one of my favorite Java APIs: simple, flexible and powerful at the same time. Its evolution made it both a bit uglier and more convenient (I wish open types were simpler to use and less verbose to declare). Providing a library to use JMX in JRuby applications is the perfect match between a simple and powerful library (JMX) and a simple and powerful language (Ruby) on a simple and powerful platform (the JVM).
From time to time, I receive mails inquiring about the status of jmx4r. I no longer use jmx4r in my daily job and it is not actively developed. However it is actively maintained. If you find bugs, I can release a new version quite fast. jmx4r users are awesome and most of the time, they provide patches when they report bugs. If you have ideas for improvements, do not hesitate to send me a mail or fork the project. Most recent contributions were done through forks on GitHub.
Once again, thanks to all the contributors and users who help make jmx4r even more useful!
As usual, to get this new release, just update the rubygem (with JRuby: jruby -S gem update jmx4r).
With the releases of iOS 4 and a developer preview of Xcode 4, I am doing some Mac development again.
As I have a few ideas I want to try on iOS 4, I had to renew the Apple ransom tax developer license to be able to put my own code on my own device, but I digress...
I took the opportunity to update TangTouch (iTunes link), the game I wrote to learn iPhone development, to support iOS 4.
There have been several releases of iOS (née iPhone OS) since I wrote this application, but the APIs are fairly stable and I only had to change a few deprecated calls. As a bonus, it now supports multitasking: if you leave the game and come back later, it will be as you left it. I released the new version on the App Store. This game was a learning tool and it shows a lack of polish (I am by no means a graphic designer). The game is free so that I don't have to refund it if people do not like it :)
Apple has also released a developer preview of Xcode 4 that I am evaluating. Xcode 4's user interface was redesigned around a single window instead of cluttering the desktop with many windows. The window layout modes are surprising (for example, to display corresponding .m and .h files next to each other) and I have not decided which ones I prefer. I still have issues running Interface Builder inside Xcode 4, so I revert to Xcode 3.2 to do some UI work.
I was also interested to see how Git was integrated, as I use it for all my personal projects. The user interface to browse a file's history is innovative (and fast!), reminiscent of Time Machine.
At the moment, Git integration is fairly limited. Xcode does not seem to handle Git branches and stashes. I did not find how to browse the whole project's history either, and it does not seem possible to add remote repositories, as only the local Git repository is used (no push or pull to remote repositories). I hope that Apple will fully embrace Git and expose all its features in a clean way. For now, GitX and the command line are still the best way to interact with Git: I develop first and organize my commits later (add interactively, merge commits, rebase, etc.).
Coming from a Java background, it is always fun and very interesting to dive into other environments. I appreciate how well designed the Cocoa APIs are (especially compared to so many Java & JEE APIs...) and I like Objective-C, even though its incidental complexity is close to Java's. The APIs and the developer tools really make it worthwhile to develop on iOS and the Mac, and to have fun while doing it.
Yesterday I was at a Joe Cocker concert, and the day before at a concert by the Gospel Institut to celebrate the end of their courses... and the late arrival of summer (even though we had a few drops during the concert).
There was no security at that concert to take my camera away, so I could grab a few pictures before it became too dark. My 18-200mm lens needs lots of light, and the D60 does not give great pictures when pushing the ISO. I wish I had a 50mm f/1.4 to take pictures late in the concert, when the raw emotion was more palpable on the singers' faces.
I particularly enjoyed a beautiful interpretation of Eric Clapton's Tears In Heaven. One of my favourite songs makes for a great gospel, full of soul and emotion.
I had a great time at that concert, listening to all the singers and musicians and taking pictures, but I am disappointed by their low quality. I need to practice more in low-light settings and invest in the Nikon 50mm f/1.4 to get better results. The photographer would still be to blame for the unoriginal compositions :)