Tag: Crs309
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
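For example, on a Linux machine plugged directly into the switch, you can temporarily add an address in that subnet to reach the web GUI (a minimal sketch; the interface name enp3s0 and the host address .10 are assumptions):
# add a temporary address in the switch's default subnet (interface name is an assumption)
sudo ip addr add 192.168.88.10/24 dev enp3s0
# the admin GUI should now answer on the switch's default address
curl -I http://192.168.88.1/
# remove the temporary address again once you're done
sudo ip addr del 192.168.88.10/24 dev enp3s0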
Tag: Mikrotik
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Networking
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Router
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Routeros
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Switch
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Swos
Initial Setup of a MikroTik CRS309 Switch
Setting up a new MikroTik CRS309 turned out to be a bit more complicated than anticipated.
This post outlines my process of setting up the switch. The screenshots show the newer GUI, as I'm reconstructing this process after setting everything up.
Login to the Admin GUI
The MikroTik manual states you need to use a Windows program called WinBox, but I found out the switch has a web GUI running on port 80, so that's what I used. The switch boots up with an IP of 192.168.88.1 and a netmask of 255.255.255.0, so you need to set an IP in the same subnet on your computer to access the web GUI.
Tag: Debian
Proxmox on mirrored root zfs
Why
I wanted my new Proxmox to have mirrored boot drives for redundancy reasons. I used somewhat large drives, meaning to pass them through to a TrueNAS instance.
How
Basically, just install Debian on a mirrored root ZFS and then put Proxmox on top:
- Put two drives into the system
- Follow the Root on ZFS Guide by OpenZFS and set the drives up as a mirror (see the sketch after this list)
- Follow the Install Proxmox VE on Debian Bookworm Guide by Proxmox. As new Debian versions are released, new guides will probably be published, but the basics likely remain the same
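To give a feel for the mirror layout, here is a heavily condensed sketch of the pool-creation step (the real OpenZFS guide also covers partitioning, a separate boot pool and datasets; the disk IDs and the pool name rpool are placeholders):
# create the root pool as a two-way mirror (disk IDs are placeholders)
zpool create -o ashift=12 rpool mirror \
    /dev/disk/by-id/nvme-DISK_A-part3 \
    /dev/disk/by-id/nvme-DISK_B-part3
# verify that both drives show up under the mirror vdev
zpool status rpool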
How to fix file permissions on debian
It just so happens I recently migrated a Linux system from ext4 to btrfs and didn't want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell 'cp' to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn't executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only 'solution' I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
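For reference, the copy itself should have preserved ownership and permissions in the first place; something along these lines would have avoided the whole mess (mount points are placeholders):
# copy the whole system while preserving ownership, permissions, ACLs, xattrs and hard links
rsync -aAXH /mnt/old-root/ /mnt/new-root/
# or, with plain cp, use archive mode
cp -a /mnt/old-root/. /mnt/new-root/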
Tag: Mirror
Proxmox on mirrored root zfs
Why
I wanted my new Proxmox to have mirrored boot drives for redundancy reasons. I used somewhat large drives, meaning to pass them through to a TrueNAS instance.
How
Basically, just install Debian on a mirrored root ZFS and then put Proxmox on top:
- Put two drives into the system
- Follow the Root on ZFS Guide by OpenZFS and set the drives up as a mirror
- Follow the Install Proxmox VE on Debian Bookworm Guide by Proxmox. As new Debian versions are released, new guides will probably be published, but the basics likely remain the same
Tag: Proxmox
Proxmox on mirrored root zfs
Why
I wanted my new Proxmox to have mirrored boot drives for redundancy reasons. I used somewhat large drives, meaning to pass them through to a TrueNAS instance.
How
Basically, just install Debian on a mirrored root ZFS and then put Proxmox on top:
- Put two drives into the system
- Follow the Root on ZFS Guide by OpenZFS and set the drives up as a mirror
- Follow the Install Proxmox VE on Debian Bookworm Guide by Proxmox. As new Debian versions are released, new guides will probably be published, but the basics likely remain the same
Tag: Server
Proxmox on mirrored root zfs
Why
I wanted my new Proxmox to have mirrored boot drives for redundancy reasons. I used somewhat large drives, meaning to pass them through to a TrueNAS instance.
How
Basically, just install Debian on a mirrored root ZFS and then put Proxmox on top:
- Put two drives into the system
- Follow the Root on ZFS Guide by OpenZFS and set the drives up as a mirror
- Follow the Install Proxmox VE on Debian Bookworm Guide by Proxmox. As new Debian versions are released, new guides will probably be published, but the basics likely remain the same
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Tag: Zfs
Proxmox on mirrored root zfs
Why
I wanted my new Proxmox to have mirrored boot drives for redundancy reasons. I used somewhat large drives, meaning to pass them through to a TrueNAS instance.
How
Basically, just install Debian on a mirrored root ZFS and then put Proxmox on top:
- Put two drives into the system
- Follow the Root on ZFS Guide by OpenZFS and set the drives up as a mirror
- Follow the Install Proxmox VE on Debian Bookworm Guide by Proxmox. As new Debian versions are released, new guides will probably be published, but the basics likely remain the same
Tag: Electronics
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Always check those cheap electronics parts for faults
Yes, tons of stuff can be ordered for next to nothing from China, but be sure to check the quality. Sometimes this stuff is just billig (cheap) and not günstig/preiswert (good value for money)… Caveat emptor.
Tag: Input
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Multimeter
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Protection
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Review
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Teardown
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Uni-T
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: UT61E
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: Voltmeter
Buying Multimeters: The Uni-T UT61E
This is not a review, just a quick teardown of the Uni-T UT61E. If you expected a review, check out the in-depth review by Martin Lorton on YouTube:
- Part 1: First Look and Basic Functions
- Part 2: Look at PC data logging software, UltraDMM software, mains measurement and more
- Part 3: A look inside at input protection / build quality. Tests battery consumption and the low-battery warning
- Part 4: Torture test to see temperature stability and calibration with DMMCheck
And to balance things out, also check out what Dave Jones of the EEVBlog has to say about the Uni-T UT71E and UT61E in general.
Tag: DPS-800GB
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Tag: HP
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Tag: Power
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Tag: Supply
DPS-800GB A 1kW 12V 82A HP Server Power Supply
I recently snatched four of these nifty 1kW power supplies. They were all used but still working. Well, I kinda damaged one of them, but more on that later. Here are a few shots of these things. They're pretty compact for that amount of power.
Here they are, removed from their cases and ready for some customization. To disassemble, remove all screws on the outside of the case except for the eight that hold the fans in, remove the lid, then remove the four screws in the corners of the PCB. Pop out the mains connector and push in the LED. Then you need to pull the mains plug out of the case so you can guide the mains wires through the opening where the mains plug was. It's a bit fiddly, but it'll come out. Just be patient and try not to rip off the caps and whatever else is soldered to the mains plug. The fans are in their own little subassembly which can be gently pried away from the rest of the board after unplugging the fans. Please note that the PSU will shut down a few seconds after powerup if either of the fans isn't connected and running.
Tag: Cheap
Always check those cheap electronics parts for faults
Yes, tons of stuff can be ordered for next to nothing from China, but be sure to check the quality. Sometimes this stuff is just billig (cheap) and not günstig/preiswert (good value for money)… Caveat emptor.
Tag: China
Always check those cheap electronics parts for faults
Yes, tons of stuff can be ordered for next to nothing from China, but be sure to check the quality. Sometimes this stuff is just billig (cheap) and not günstig/preiswert (good value for money)… Caveat emptor.
Tag: Quality
Always check those cheap electronics parts for faults
Yes, tons of stuff can be ordered for next to nothing from China, but be sure to check the quality. Sometimes this stuff is just billig (cheap) and not günstig/preiswert (good value for money)… Caveat emptor.
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two, you can see quite a difference in the color of the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
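A quick way to reproduce this kind of test yourself (a sketch using ImageMagick; file names and the quality setting are assumptions, and the diff step highlights changed pixels rather than being a literal XOR):
# re-save the image 30 times at quality 100
cp original.jpg resaved.jpg
for i in $(seq 1 30); do
    convert resaved.jpg -quality 100 resaved.jpg
done
# produce a difference image between the original and the final re-save
compare original.jpg resaved.jpg difference.png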
Tag: Bamboo
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
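The script itself is not shown in this excerpt, but the core of the XSLT trick looks roughly like this (a sketch; the stylesheet, the version scheme and the file names are assumptions, not the original script):
# write a small stylesheet that copies the pom as-is but overrides the project version
cat > set-version.xsl <<'EOF'
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:pom="http://maven.apache.org/POM/4.0.0">
  <xsl:param name="buildNumber"/>
  <!-- identity transform: copy everything unchanged -->
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <!-- ...except the project version, which gets our own build number -->
  <xsl:template match="/pom:project/pom:version/text()">
    <xsl:text>1.0.</xsl:text><xsl:value-of select="$buildNumber"/>
  </xsl:template>
</xsl:stylesheet>
EOF
# apply it, feeding in the CI server's build number (Jenkins sets BUILD_NUMBER)
xsltproc --stringparam buildNumber "${BUILD_NUMBER:-0}" set-version.xsl pom.xml > pom-patched.xml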
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here's an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company Bamboo server. But first you need to know a little bit of context:
We had our Bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. In each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can change the script, set the variable RUN_AS_USER=bamboo, then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal Unix service.
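Concretely, the per-agent setup looks something like this (a sketch; the paths match the example above, and the update-rc.d step is an assumption based on a standard Debian init setup):
# expose the wrapper script as an init script
sudo ln -s /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh /etc/init.d/java-agent-1
# register it for the default runlevels and start it like any other service
sudo update-rc.d java-agent-1 defaults
sudo service java-agent-1 start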
Tag: Build
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Reuse repository definitions in gradle
A project I'm working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don't want to maintain everything twice, so I thought some re-use was in order. Thanks to the excellent Gradle javadocs and API design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
    repositories {
        mavenLocal()
        jcenter()
        maven {
            credentials {
                username 'user'
                password 'pass'
            }
            url 'https://example.com/maven2'
        }
    }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
To gradle daemon or not to gradle daemon?
The Gradle daemon speeds up builds quite a bit, and you do want to have it running on your local machine but not on the build server; that one should always re-build from scratch. This is actually quite easy to accomplish.
In your ~/.bashrc, add the following export so your local machine runs all builds with the Gradle daemon:
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
Since, by default, the Gradle daemon is not started, it will not be used on your build server.
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I've done this using the cxf-codegen-plugin, but I want to avoid using anything Maven-specific in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF's Ant task or directly. The 'direct' way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here's another little "gotcha" about Gradle tasks: when they're triggered. I've seen quite a few posts on Stack Overflow along the lines of "How can I call a Gradle task?". You can't (well, OK, you could, but you don't want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn't work, so I tried compileJava and classes and… only later realized that the dependency was pointing in the wrong direction: my task depending on one that is being executed doesn't trigger my task; it merely states that my task, should it be executed, can only run after that other task.
Anatomy of a Gradle Task: Execution Time
So I'm trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren't clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won't pick them up since they're not Java source files. What happened was that, depending on what approach I tried, sometimes the copy worked, sometimes it didn't, and I couldn't understand why.
Tag: Github
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Tag: Gradle
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Reuse repository definitions in gradle
A project I'm working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don't want to maintain everything twice, so I thought some re-use was in order. Thanks to the excellent Gradle javadocs and API design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
    repositories {
        mavenLocal()
        jcenter()
        maven {
            credentials {
                username 'user'
                password 'pass'
            }
            url 'https://example.com/maven2'
        }
    }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
To gradle daemon or not to gradle daemon?
The Gradle daemon speeds up builds quite a bit, and you do want to have it running on your local machine but not on the build server; that one should always re-build from scratch. This is actually quite easy to accomplish.
In your ~/.bashrc, add the following export so your local machine runs all builds with the Gradle daemon:
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
Since, by default, the Gradle daemon is not started, it will not be used on your build server.
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I've done this using the cxf-codegen-plugin, but I want to avoid using anything Maven-specific in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF's Ant task or directly. The 'direct' way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here's another little "gotcha" about Gradle tasks: when they're triggered. I've seen quite a few posts on Stack Overflow along the lines of "How can I call a Gradle task?". You can't (well, OK, you could, but you don't want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn't work, so I tried compileJava and classes and… only later realized that the dependency was pointing in the wrong direction: my task depending on one that is being executed doesn't trigger my task; it merely states that my task, should it be executed, can only run after that other task.
Anatomy of a Gradle Task: Execution Time
So I'm trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren't clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won't pick them up since they're not Java source files. What happened was that, depending on what approach I tried, sometimes the copy worked, sometimes it didn't, and I couldn't understand why.
Tag: Java
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems Jolokia doesn't support dumping everything with just one command.
So here's a really hacky, quick and dirty way to get all the information Jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2); do
    curl --silent "http://dwtest:10002/read/$name" | python -m json.tool
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Migrating Sonar to a different host
Just a few notes on how to move your Sonar installation to another machine without losing any of your config and/or history. Some of these are the normal installation steps too. This list is specific to Debian.
Add the Sonar Debian repo to your machine (/etc/apt/sources.list.d/sonar):
deb http://downloads.sourceforge.net/project/sonar-pkg/deb binary/
Update and install Sonar (the sonar package is not signed):
aptitude update && aptitude install sonar
Install PostgreSQL and set up an account for your Sonar:
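The PostgreSQL step typically boils down to something like this (a sketch; the user name, database name and ownership are assumptions, not the original post's exact commands):
aptitude install postgresql
# create a database user and database for Sonar (names are placeholders; you'll be prompted for a password)
sudo -u postgres createuser --pwprompt sonar
sudo -u postgres createdb --owner=sonar sonar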
To gradle daemon or not to gradle daemon?
The Gradle daemon speeds up builds quite a bit, and you do want to have it running on your local machine but not on the build server; that one should always re-build from scratch. This is actually quite easy to accomplish.
In your ~/.bashrc, add the following export so your local machine runs all builds with the Gradle daemon:
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
Since, by default, the Gradle daemon is not started, it will not be used on your build server.
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here's an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company Bamboo server. But first you need to know a little bit of context:
We had our Bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. In each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can change the script, set the variable RUN_AS_USER=bamboo, then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal Unix service.
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I've done this using the cxf-codegen-plugin, but I want to avoid using anything Maven-specific in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF's Ant task or directly. The 'direct' way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here's another little "gotcha" about Gradle tasks: when they're triggered. I've seen quite a few posts on Stack Overflow along the lines of "How can I call a Gradle task?". You can't (well, OK, you could, but you don't want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn't work, so I tried compileJava and classes and… only later realized that the dependency was pointing in the wrong direction: my task depending on one that is being executed doesn't trigger my task; it merely states that my task, should it be executed, can only run after that other task.
Anatomy of a Gradle Task: Execution Time
So I'm trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren't clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won't pick them up since they're not Java source files. What happened was that, depending on what approach I tried, sometimes the copy worked, sometimes it didn't, and I couldn't understand why.
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it's an official best practice to roll your own company or department Maven proxy/cache.
So, depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here's a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        ${MVN} -gs ${HOME}/.m2/settings-work.xml "$@"
    else
        echo ">>> Running mvn with vanilla config"
        ${MVN} "$@"
    fi
}
This just checks for the IP of eth0 and calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Accessing Request Parameters in a Grails Domain Class
It's not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with "asMap" methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is given to the renderer and we get a nice JSON response.
So now I've changed some fields and, in order to stay backwards compatible, I created a new apiKey (a parameter needed for all calls to my app) that distinguishes old and new clients.
The simplest backup solution ever...
… not for everyone maybe, but for me as a Linux/Java/Groovy person it is!
You'll need
- java
- groovy
- rsync
- ssh
- cron
I want to regularly copy files from a bunch of paths on several remote hosts to the localhost. I don't know enough about arrays etc. in Bash to configure this in a simple way, but the solution I've come up with is simple and elegant:
I wrote a little Groovy script that generates the shell commands to be executed. The output of the Groovy script is piped into Bash, which executes the commands. The call to the Groovy script, with the piping into Bash, lives in a little shell script which is called from cron.
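The glue layer could look roughly like this (a sketch; the script names, paths and the cron schedule are assumptions, not the original script):
#!/bin/bash
# backup.sh - let the Groovy script print the rsync commands, then execute them
groovy /home/me/bin/generate-backup-commands.groovy | bash
# crontab entry: run the backup every night at 03:00
# 0 3 * * * /home/me/bin/backup.sh >> /var/log/backup.log 2>&1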
Tag: Jenkins
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Tag: Maven
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Reuse repository definitions in gradle
A project I'm working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don't want to maintain everything twice, so I thought some re-use was in order. Thanks to the excellent Gradle javadocs and API design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
    repositories {
        mavenLocal()
        jcenter()
        maven {
            credentials {
                username 'user'
                password 'pass'
            }
            url 'https://example.com/maven2'
        }
    }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I've done this using the cxf-codegen-plugin, but I want to avoid using anything Maven-specific in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF's Ant task or directly. The 'direct' way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here's another little "gotcha" about Gradle tasks: when they're triggered. I've seen quite a few posts on Stack Overflow along the lines of "How can I call a Gradle task?". You can't (well, OK, you could, but you don't want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn't work, so I tried compileJava and classes and… only later realized that the dependency was pointing in the wrong direction: my task depending on one that is being executed doesn't trigger my task; it merely states that my task, should it be executed, can only run after that other task.
Anatomy of a Gradle Task: Execution Time
So I'm trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren't clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won't pick them up since they're not Java source files. What happened was that, depending on what approach I tried, sometimes the copy worked, sometimes it didn't, and I couldn't understand why.
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it's an official best practice to roll your own company or department Maven proxy/cache.
So, depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here's a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        ${MVN} -gs ${HOME}/.m2/settings-work.xml "$@"
    else
        echo ">>> Running mvn with vanilla config"
        ${MVN} "$@"
    fi
}
This just checks for the IP of eth0 and calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Tag: Pom
Building custom libs that are missing in maven central or jcenter
So you really need that Java lib somebody made, but they can't be bothered to upload it to Maven Central or JCenter? Build it yourself and host it on your Nexus repo. That's what the script below does. Additionally, it includes a little XSLT magic that lets you change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more XSLT you can definitely do that too! You can run this script locally or on a build server (the script uses Jenkins' BUILD_NUMBER environment variable).
Tag: Xsl
Building custom libs that are missing in maven central or jcenter
So you really need that java lib somebody made but they can’t be bothered to upload it to maven central or jcenter? Build it yourself and host it on your nexus repo. That’s what the script below does. Additionally, it includes a little xslt magic which allows you to change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more xslt you can definitely do that too! You can run this script locally or on a build server (the script uses jenkins’ BUILD_NUMBER environment variable).
Tag: Xslt
Building custom libs that are missing in maven central or jcenter
So you really need that java lib somebody made but they can’t be bothered to upload it to maven central or jcenter? Build it yourself and host it on your nexus repo. That’s what the script below does. Additionally, it includes a little xslt magic which allows you to change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more xslt you can definitely do that too! You can run this script locally or on a build server (the script uses jenkins’ BUILD_NUMBER environment variable).
Tag: Xsltproc
Building custom libs that are missing in maven central or jcenter
So you really need that java lib somebody made but they can’t be bothered to upload it to maven central or jcenter? Build it yourself and host it on your nexus repo. That’s what the script below does. Additionally, it includes a little xslt magic which allows you to change the pom.xml in any way you want. Are they using snapshot versions? No problem, just insert your own (as shown in the script). Need to change a dependency? With some more xslt you can definitely do that too! You can run this script locally or on a build server (the script uses jenkins’ BUILD_NUMBER environment variable).
Tag: Bintray
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Dependencies
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Groovy
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
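A minimal sketch of what that looks like (the pipe needs a shell, so the list just wraps everything in sh -c):
// count the commits of the repo in the current directory
def proc = ['sh', '-c', 'git log --oneline | wc -l'].execute()
println proc.text.trim()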
Using WSDLToJava with Gradle
I currently need to generate some java sources from a wsdl with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via cxf’s ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle Task or using a JavaExec Task. I went for the latter.
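For illustration, a JavaExec task along these lines should do it - note that the cxf artifact coordinates, version and wsdl path here are assumptions for the sketch, not taken from my actual build:
configurations { cxf }
dependencies {
    // cxf tooling needed on the classpath for WSDLToJava (version is an assumption)
    cxf 'org.apache.cxf:cxf-tools-wsdlto-frontend-jaxws:2.7.18'
    cxf 'org.apache.cxf:cxf-tools-wsdlto-databinding-jaxb:2.7.18'
}
task wsdl2java(type: JavaExec) {
    classpath = configurations.cxf
    main = 'org.apache.cxf.tools.wsdlto.WSDLToJava'
    // output directory and wsdl location are made up for this sketch
    args '-d', "$buildDir/generated-sources/wsdl", 'src/main/resources/service.wsdl'
}
sourceSets.main.java.srcDir "$buildDir/generated-sources/wsdl"
compileJava.dependsOn wsdl2java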
Anatomy of a Gradle Task: Task Dependencies
Here’s another little “gotcha” about gradle tasks: when they’re triggered. I’ve seen quite a few posts on stackoverflow along the lines of “How can I call a gradle task?”. You can’t (well, OK, you could, but you don’t want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task depend on compile. That didn’t work, so I tried compileJava and classes and… only realized later that the dependency was pointing in the wrong direction: making my task depend on one that is being executed doesn’t trigger my task, it merely states that my task, should it be executed, can only run after that other task.
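In build.gradle terms, the difference looks roughly like this (task and path names are illustrative, not the exact build from the post):
task copyProperties(type: Copy) {
    from 'src/main/java'
    include '**/*.properties'
    into sourceSets.main.output.resourcesDir
}
// wrong way round: this does NOT make the build run copyProperties,
// it only means that *if* copyProperties runs, classes has to run first
// copyProperties.dependsOn classes
// right way round: hook my task into one that is executed anyway
classes.dependsOn copyProperties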
Anatomy of a Gradle Task: Execution Time
So I’m trying out gradle in the hope of being able to simplify and streamline an existing maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about gradle tasks, and this post is meant to help others understand them, because I think the gradle docs aren’t clear enough on this point.
What I wanted to achieve was to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won’t pick them up since they’re not java source files. What happened was that, depending on which approach I tried, sometimes the copy worked and sometimes it didn’t, and I couldn’t understand why.
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is given to the renderer and we get a nice json response.
So now I’ve changed some fields and in order to stay backwards compatible, I created a new apiKey (a parameter needed for all calls to my app) that distinguishes old and new clients.
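For illustration only, one common way to get at the params outside a controller is Spring’s RequestContextHolder - the domain class, field and apiKey value below are made up, not my actual code:
import org.springframework.web.context.request.RequestContextHolder

class Thing {
    String name
    Map asMap() {
        // only works while a web request is bound to the current thread
        def params = RequestContextHolder.currentRequestAttributes().params
        boolean oldClient = params?.apiKey == 'legacy-key'   // made-up value
        oldClient ? [name: name] : [displayName: name]
    }
}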
The simplest backup solution ever...
… not for everyone maybe, but for me as a linux/java/groovy person it is!
you’ll need
- java
- groovy
- rsync
- ssh
- cron
I want to regularly copy files from a bunch of paths on several remote hosts to the localhost. I know nothing about arrays and the like in bash that would let me configure this stuff in a simple way, but the solution I’ve come up with is simple and elegant:
I wrote a little groovy script that generates the shell commands to be executed. The output of the groovy script is piped into bash, which executes the commands. That call of the groovy script, with the pipe into bash, sits in a little shell script which is called from cron.
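To give an idea of the shape of it, here’s a minimal sketch (hosts, paths and the script name are made up) - it just prints one rsync command per path, and the cron’d wrapper boils down to groovy backup-gen.groovy | bash:
// backup-gen.groovy - minimal sketch, hosts and paths are made up
def backups = [
    'server1.example.com': ['/etc', '/var/backups'],
    'server2.example.com': ['/home/user/documents']
]
backups.each { host, paths ->
    paths.each { path ->
        println "rsync -az --delete ${host}:${path}/ /backup/${host}${path}/"
    }
}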
Tag: Jcenter
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Mavencentral
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Mavenlocal
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Repositories
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Reuse
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Share
Reuse repository definitions in gradle
A project I’m working on has accumulated a bunch of repositories we need for the build.gradle script and for the build itself. I don’t want to keep everything twice so I thought some re-use is in order. Thanks to the excellent gradle javadocs and api design, it was easy to accomplish. The end result is this:
// add your repositories here
buildscript {
  repositories {
    mavenLocal()
    jcenter()
    maven {
      credentials {
        username 'user'
        password 'pass'
      }
      url 'https://example.com/maven2'
    }
  }
}
// re-use repositories from buildscript
buildscript.repositories.each { repositories.add(it) }
Tag: Bash
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems jolokia doesn’t support dumping everything with just one command.
So here’s a really hacky quick and dirty way to get all information jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2);
do curl --silent "http://dwtest:10002/read/$name" | python -m json.tool;
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Using awk to remove lines from a file
A little goodie I’m writing up here to document for my own future reference.
Objective: Remove all lines matching a certain pattern from all files in a directory structure. The pattern is in the form of “part1.
First try: grep and sed, unfortunately that leaves me with blank lines where the matching content was and I want that line removed completely. Apparently sed can’t do that.
Second try: use awk with regexes. It took me a while and I wanted to use ’next’ on matching lines but that didn’t seem to work so I ended up printing the entire line if the regex didn’t match. Here goes:
Simple python script to access jmx via jolokia
I’ve never been much of a python guy. Now there’s this production environment where we’ve got a few basic things like bash, ruby (since we’re using puppet) and python. Our Java based software has a nice JMX interface with a bunch of lifesavers in it and each JVM has the jolokia agent running (http/json access to jmx) so we can access this goodness fairly easily from scripts and/or the shell. So far what we’d be doing would be something like this:
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, “Saving a JPEG multiple times degrades quality”, which was supposedly proven wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. In each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can change that script to set the variable RUN_AS_USER=bamboo, create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh, and then start/stop the service just like any other normal unix service.
The simplest way to create pdf's from images
There’s probably no simpler way to create a pdf from images. Behold:
convert *.jpg output.pdf
That’s all folks!
Tag: Jmx
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems jolokia doesn’t support dumping everything with just one command.
So here’s a really hacky quick and dirty way to get all information jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2);
do curl --silent "http://dwtest:10002/read/$name" | python -m json.tool;
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Simple python script to access jmx via jolokia
I’ve never been much of a python guy. Now there’s this production environment where we’ve got a few basic things like bash, ruby (since we’re using puppet) and python. Our Java based software has a nice JMX interface with a bunch of lifesavers in it and each JVM has the jolokia agent running (http/json access to jmx) so we can access this goodness fairly easily from scripts and/or the shell. So far what we’d be doing would be something like this:
Tag: Jolokia
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems jolokia doesn’t support dumping everything with just one command.
So here’s a really hacky quick and dirty way to get all information jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2);
do curl --silent "http://dwtest:10002/read/$name" | python -m json.tool;
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Simple python script to access jmx via jolokia
I’ve never been much of a python guy. Now there’s this production environment where we’ve got a few basic things like bash, ruby (since we’re using puppet) and python. Our Java based software has a nice JMX interface with a bunch of lifesavers in it and each JVM has the jolokia agent running (http/json access to jmx) so we can access this goodness fairly easily from scripts and/or the shell. So far what we’d be doing would be something like this:
Tag: Json
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems jolokia doesn’t support dumping everything with just one command.
So here’s a really hacky quick and dirty way to get all information jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2);
do curl --silent "http://dwtest:10002/read/$name" | python -m json.tool;
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Simple python script to access jmx via jolokia
I’ve never been much of a python guy. Now there’s this production environment where we’ve got a few basic things like bash, ruby (since we’re using puppet) and python. Our Java based software has a nice JMX interface with a bunch of lifesavers in it and each JVM has the jolokia agent running (http/json access to jmx) so we can access this goodness fairly easily from scripts and/or the shell. So far what we’d be doing would be something like this:
Tag: Shell
Dump all (most?) JMX Beans via Jolokia using just the shell and a bit of json formatting
It seems jolokia doesn’t support dumping everything with just one command.
So here’s a really hacky quick and dirty way to get all information jolokia can access:
for name in $(curl --silent 'http://dwtest:10002/search/*:*' | python -m json.tool | grep '"value":' -A9999 | tail -n +2 | head -n -2 | sed 's/ /%20/g' | cut -d'"' -f2);
do curl --silent "http://dwtest:10002/read/$name" | python -m json.tool;
done
Once you’ve gotten this far, piping the information into a file or another tool is trivial.
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
  for file in $(find ${dir} -name '*.pp'); do
    echo ">>> TESTING ${file}"
    puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
    echo "------------------------------------------------------------------------------"
  done
done
if [ "${HAS_FAILURES}" = "true" ]; then
  exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Using awk to remove lines from a file
A little goodie I’m writing up here to document for my own future reference.
Objective: Remove all lines matching a certain pattern from all files in a directory structure. The pattern is in the form of “part1.
First try: grep and sed, unfortunately that leaves me with blank lines where the matching content was and I want that line removed completely. Apparently sed can’t do that.
Second try: use awk with regexes. It took me a while and I wanted to use ’next’ on matching lines but that didn’t seem to work so I ended up printing the entire line if the regex didn’t match. Here goes:
Simple python script to access jmx via jolokia
I’ve never been much of a python guy. Now there’s this production environment where we’ve got a few basic things like bash, ruby (since we’re using puppet) and python. Our Java based software has a nice JMX interface with a bunch of lifesavers in it and each JVM has the jolokia agent running (http/json access to jmx) so we can access this goodness fairly easily from scripts and/or the shell. So far what we’d be doing would be something like this:
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, “Saving a JPEG multiple times degrades quality”, which was supposedly proven wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. In each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can change that script to set the variable RUN_AS_USER=bamboo, create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh, and then start/stop the service just like any other normal unix service.
Automatically switch your mvn settings
I have one primary development notebook and take that with me wherever I go. Now in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department maven proxy/cache.
So depending on where you are, you might need a different maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
  MVN="$(which mvn)"
  if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
    echo ">>> Running mvn with work config"
    ${MVN} -gs ${HOME}/.m2/settings-work.xml "$@"
  else
    echo ">>> Running mvn with vanilla config"
    ${MVN} "$@"
  fi
}
This just checks whether eth0 has your work IP and, if so, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
The simplest way to create pdf's from images
There’s probably no simpler way to create a pdf from images. Behold:
convert *.jpg output.pdf
That’s all folks!
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced linux admin/hacker/hobbyist. Some past postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server ready to accept the e-mail from me. It will then take care of everything else like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP happens to be interrupted it’s no big deal because here too, the local postfix will handle re-sending for me. Nice, huh?
Tag: Command
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Tag: Execute
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Tag: Process
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Tag: Sh
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Tag: System
Executing shell commands from groovy
Sometimes you want to run a shell command generated from groovy or just want to achieve something that’s faster/simpler in the shell in comparison to doing it with groovy. One such case would be to get the number of git commits of a repo. The command for this is fairly simple:
$ git log --oneline | wc -l
179
Running shell commands from groovy is really easy too. Make a list and call ’execute()’ on it - how awesome is that?
Tag: Btrfs
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Tag: Ext4
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Tag: Linux
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Migrating Sonar to a different host
Just a few notes on how to move your sonar installation to another machine without losing any of your config and/or history. Some of these notes cover the normal installation steps too. This list is specific to debian.
Add the sonar debian repo to your machine (/etc/apt/sources.list.d/sonar)
deb http://downloads.sourceforge.net/project/sonar-pkg/deb binary/
Update and install sonar (sonar pkg is not signed):
aptitude update && aptitude install sonar
Install postgresql and set up an account for your sonar:
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
How to rename/merge projects in sonar
Sonar doesn’t support merging of projects out of the box, so when you happen to rename a project as I just did, you’re suddenly stuck with 2 projects of the same name unless you do a little bit of sql trickery in the back. Here’s how you do it (quick & dirty approach). What we’re going to do is: rename the old projects (with all the juicy metrics) to the new groupid/artifactid and delete the new one (which only has a handful of builds anyway and we can afford to lose these, but not the year-long history we’ve collected):
How to mount an iPod in linux
I’m currently trying to get rid of my iPod and move all my music to my android phone. Needless to say my iTunes installation on the PC is long gone, and these iDevices are the proverbial bottomless pit - you’ll never get anything back out of them. Or can you?
Thanks to the libimobiledevice and ifuse projects, you can now mount an ipod like any normal usb stick in linux, and it doesn’t even have to be a rooted device.
First steps on a new postgresql installation
I’ve found myself trying postgres over mysql for a few things and had some trouble accessing the db server at all at first. So here’s how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn’t so hard now was it? Once I’m in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it’s currently attached to, hook it to a new vm, which will boot from an iso file. Once booted, start gparted. It will complain that the gpt partition table is not (anymore) at the end of the disk, ignore that message and you’ll see the full disk with the initial partition plus additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Using awk to remove lines from a file
A little goodie I’m writing up here to document for my own future reference.
Objective: Remove all lines matching a certain pattern from all files in a directory structure. The pattern is in the form of “part1.
First try: grep and sed, unfortunately that leaves me with blank lines where the matching content was and I want that line removed completely. Apparently sed can’t do that.
Second try: use awk with regexes. It took me a while and I wanted to use ’next’ on matching lines but that didn’t seem to work so I ended up printing the entire line if the regex didn’t match. Here goes:
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, “Saving a JPEG multiple times degrades quality”, which was supposedly proven wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. In each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can change that script to set the variable RUN_AS_USER=bamboo, create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh, and then start/stop the service just like any other normal unix service.
Automatically switch your mvn settings
I have one primary development notebook and take that with me wherever I go. Now in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department maven proxy/cache.
So depending on where you are, you might need a different maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
  MVN="$(which mvn)"
  if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
    echo ">>> Running mvn with work config"
    ${MVN} -gs ${HOME}/.m2/settings-work.xml "$@"
  else
    echo ">>> Running mvn with vanilla config"
    ${MVN} "$@"
  fi
}
This just checks whether eth0 has your work IP and, if so, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
The simplest way to create pdf's from images
There’s probably no simpler way to create a pdf from images. Behold:
convert *.jpg output.pdf
That’s all folks!
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I’ve tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening (after noise reduction, because a lot was shot at high ISO - for me and my 50D, ISO 1250 is a lot already) but wasn’t happy with it. I found reasonable settings for noise reduction, which makes the image a bit softer, but sharpening wouldn’t do anything to compensate, let alone actually sharpen.
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced linux admin/hacker/hobbyist. Some past postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server ready to accept the e-mail from me. It will then take care of everything else like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP happens to be interrupted it’s no big deal because here too, the local postfix will handle re-sending for me. Nice, huh?
The simplest backup solution ever...
… not for everyone maybe, but for me as a linux/java/groovy person it is!
you’ll need
- java
- groovy
- rsync
- ssh
- cron
I want to regularly copy files from a bunch of paths on several remote hosts to the localhost. I know nothing about arrays and the like in bash that would let me configure this stuff in a simple way, but the solution I’ve come up with is simple and elegant:
I wrote a little groovy script that generates the shell commands to be executed. The output of the groovy script is piped into bash, which executes the commands. That call of the groovy script, with the pipe into bash, sits in a little shell script which is called from cron.
Tag: Permission
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Tag: Restore
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Tag: Resurrect
How to fix file permissions on debian
It just so happens I recently migrated a linux system from ext4 to btrfs and didn’t want to do a fresh install. So make a new partition, format it and copy the system over… Too bad I forgot to tell ‘cp’ to preserve the file permissions. I ended up with a system where every last file was owned by root, and that obviously causes problems with everything that isn’t executed as root. And of course, the old partition was already gone - bad luck. Reinstalling is pretty much the only ‘solution’ I could find for this problem on the web. What follows are a few simple steps to resurrect a system to at least a runnable / good-enough state:
Tag: Migration
Migrating Sonar to a different host
Just a few notes on how to move your sonar installation to another machine without losing any of your config and/or history. Some of these notes cover the normal installation steps too. This list is specific to debian.
Add the sonar debian repo to your machine (/etc/apt/sources.list.d/sonar)
deb http://downloads.sourceforge.net/project/sonar-pkg/deb binary/
Update and install sonar (sonar pkg is not signed):
aptitude update && aptitude install sonar
Install postgresql and set up an account for your sonar:
Tag: Postgresql
Migrating Sonar to a different host
Just a few notes on how to move your sonar installation to another machine without losing any of your config and/or history. Some of these notes cover the normal installation steps too. This list is specific to debian.
Add the sonar debian repo to your machine (/etc/apt/sources.list.d/sonar)
deb http://downloads.sourceforge.net/project/sonar-pkg/deb binary/
Update and install sonar (sonar pkg is not signed):
aptitude update && aptitude install sonar
Install postgresql and set up an account for your sonar:
Tag: Sonar
Migrating Sonar to a different host
Just a few notes on how to move your sonar installation to another machine without losing any of your config and/or history. Some of these notes cover the normal installation steps too. This list is specific to debian.
Add the sonar debian repo to your machine (/etc/apt/sources.list.d/sonar)
deb http://downloads.sourceforge.net/project/sonar-pkg/deb binary/
Update and install sonar (sonar pkg is not signed):
aptitude update && aptitude install sonar
Install postgresql and set up an account for your sonar:
How to rename/merge projects in sonar
Sonar doesn’t support merging of projects out of the box, so when you happen to rename a project as I just did, you’re suddenly stuck with 2 projects of the same name unless you do a little bit of sql trickery in the back. Here’s how you do it (quick & dirty approach). What we’re going to do is: rename the old projects (with all the juicy metrics) to the new groupid/artifactid and delete the new one (which only has a handful of builds anyway and we can afford to lose these, but not the year-long history we’ve collected):
Tag: Adobe
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Tag: Apt
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Tag: Apt-Get
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Tag: Aptitude
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Tag: Flash
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn’t honor the http_proxy environment variable, so I had to do a little hack to get this going. It’s actually pretty simple: download the file and serve it from localhost so we don’t need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz (or whatever version YOU need)
- python -m SimpleHTTPServer 80
Second shell:
Tag: Flashplugin
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn't honor the http_proxy environment variable, so I had to do a little hack to get this going. It's actually pretty simple: download the file and serve it from localhost so we don't need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz ==> or whatever version YOU need
- python -m SimpleHTTPServer 80
Second shell:
Tag: Install
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn't honor the http_proxy environment variable, so I had to do a little hack to get this going. It's actually pretty simple: download the file and serve it from localhost so we don't need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz ==> or whatever version YOU need
- python -m SimpleHTTPServer 80
Second shell:
Tag: Plugin
Installing/Upgrading flash on Ubuntu behind a proxy
This just refused to work because the flashplugin-installer that actually downloads the package doesn't honor the http_proxy environment variable, so I had to do a little hack to get this going. It's actually pretty simple: download the file and serve it from localhost so we don't need the proxy at all.
First shell:
- mkdir /tmp/flash
- cd /tmp/flash
- wget http://archive.canonical.com/pool/partner/a/adobe-flashplugin/adobe-flashplugin_11.2.202.310.orig.tar.gz ==> or whatever version YOU need
- python -m SimpleHTTPServer 80
Second shell:
Tag: Backup
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
GMail Backup
Considering that Google just lost tons of their users' E-Mails, you might want to do a GMail Backup. Also, let's see how this will develop ;-)
Tag: Fuse
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
Tag: Mp3
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
Tag: Music
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
Tag: Python
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
Simple python script to access jmx via jolokia
I've never been much of a python guy. Now there's this production environment where we've got a few basic things like bash, ruby (since we're using puppet) and python. Our Java-based software has a nice JMX interface with a bunch of lifesavers in it, and each JVM has the jolokia agent running (HTTP/JSON access to JMX), so we can access this goodness fairly easily from scripts and/or the shell. So far what we'd be doing would be something like this:
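(The excerpt cuts off here; the kind of call described - jolokia's HTTP read endpoint piped through python's json.tool for pretty-printing - looks roughly like this, with the host and MBean as placeholders and 8778 being the agent's default port:)
curl -s http://somehost:8778/jolokia/read/java.lang:type=Memory/HeapMemoryUsage | python -m json.tool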
Tag: Sqlite
Reclaiming your Music from an iPod
A while ago I wrote a quick tip on how to mount an ipod in linux. I did this while working on a python script that would retrieve all my music from my ipod and put it back into a file structure of my choosing, with file and directory names that a human being can understand. The reason for this is simply the fact that I kept rating music over time and ended up rating everything - information I don't want to lose as I migrate all my music onto my phone.
Tag: Opensource
How to rename/merge projects in sonar
Sonar doesn't support merging of projects out of the box, so when you happen to rename a project as I just did, you're suddenly stuck with two projects of the same name unless you do a little bit of SQL trickery in the back. Here's how you do it (quick & dirty approach). What we're going to do is: rename the old project (with all the juicy metrics) to the new groupid/artifactid and delete the new one (which only has a handful of builds anyway; we can afford to lose those, but not the year-long history we've collected):
Tag: Postgres
How to rename/merge projects in sonar
Sonar doesn't support merging of projects out of the box, so when you happen to rename a project as I just did, you're suddenly stuck with two projects of the same name unless you do a little bit of SQL trickery in the back. Here's how you do it (quick & dirty approach). What we're going to do is: rename the old project (with all the juicy metrics) to the new groupid/artifactid and delete the new one (which only has a handful of builds anyway; we can afford to lose those, but not the year-long history we've collected):
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
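(The statement itself is truncated in this excerpt; on a development box it would be something along these lines, with 'myuser' standing in for your own login role:)
postgres=# ALTER USER myuser WITH SUPERUSER;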
Tag: Sql
How to rename/merge projects in sonar
Sonar doesn't support merging of projects out of the box, so when you happen to rename a project as I just did, you're suddenly stuck with two projects of the same name unless you do a little bit of SQL trickery in the back. Here's how you do it (quick & dirty approach). What we're going to do is: rename the old project (with all the juicy metrics) to the new groupid/artifactid and delete the new one (which only has a handful of builds anyway; we can afford to lose those, but not the year-long history we've collected):
Tag: Daemon
To gradle daemon or not to gradle daemon?
The gradle daemon speeds up builds quite a bit, and you do want to have it running on your local machine but not on the build server; that one should always rebuild from scratch. This is actually quite easy to accomplish.
In your ‘~/.bashrc’ add the following export so your local machine runs all builds with the gradle daemon:
export GRADLE_OPTS="-Dorg.gradle.daemon=true"
Since, by default, the gradle daemon is not started, it will not be used on your build server.
Tag: Ifuse
How to mount an iPod in linux
I'm currently trying to get rid of my iPod and move all my music to my android phone. Needless to say my iTunes installation on the PC is long gone, and these iDevices are the proverbial bottomless pit - you'll never get anything back out of them, or can you?
Thanks to the libimobiledevice and ifuse projects, you can now mount an ipod like any normal usb stick in linux, and it doesn’t even have to be a rooted device.
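(A minimal sketch of what that looks like on Ubuntu - the package names may differ on other distributions, and the mount point is arbitrary:)
sudo apt-get install ifuse libimobiledevice-utils
mkdir -p ~/ipod
ifuse ~/ipod # mount the device
fusermount -u ~/ipod # unmount when you're done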
Tag: Ipod
How to mount an iPod in linux
I'm currently trying to get rid of my iPod and move all my music to my android phone. Needless to say my iTunes installation on the PC is long gone, and these iDevices are the proverbial bottomless pit - you'll never get anything back out of them, or can you?
Thanks to the libimobiledevice and ifuse projects, you can now mount an ipod like any normal usb stick in linux, and it doesn’t even have to be a rooted device.
Tag: Mount
How to mount an iPod in linux
I'm currently trying to get rid of my iPod and move all my music to my android phone. Needless to say my iTunes installation on the PC is long gone, and these iDevices are the proverbial bottomless pit - you'll never get anything back out of them, or can you?
Thanks to the libimobiledevice and ifuse projects, you can now mount an ipod like any normal usb stick in linux, and it doesn’t even have to be a rooted device.
Tag: Phone
How to mount an iPod in linux
I'm currently trying to get rid of my iPod and move all my music to my android phone. Needless to say my iTunes installation on the PC is long gone, and these iDevices are the proverbial bottomless pit - you'll never get anything back out of them, or can you?
Thanks to the libimobiledevice and ifuse projects, you can now mount an ipod like any normal usb stick in linux, and it doesn’t even have to be a rooted device.
Tag: Admin
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Tag: Create
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Tag: Psql
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Tag: Superuser
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Tag: User
First steps on a new postgresql installation
I've found myself trying postgres over mysql for a few things and had some trouble even accessing the db server at first. So here's how you first access the server:
$ su - # su to root
# su - postgres # su to postgres
$ psql # access the server
That wasn't so hard now, was it? Once I'm in the psql shell, I tend to issue the following SQL statement so my primary user has unlimited access, to make things easier for development:
Tag: Apply
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Ci
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Continuous
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Init.pp
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Integration
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Module
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Pp
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Puppet
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Test
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Testing
Always test your Puppet Modules
Always test your puppet modules. Even if it’s just a little smoke test like this:
class { my_module: }
Now to make this one better, create a job on your continuous integration server and let it run the following script inside the modules directory of your puppet project:
#!/bin/bash
#DEBUG="--verbose --debug"
hash puppet 2>/dev/null || { echo >&2 "Please install puppet"; exit 2; }
[ -z "$(facter | grep fqdn)" ] && { echo >&2 "Your machine has no FQDN (according to facter), some tests may fail or print warnings"; sleep 5; }
for dir in $(find . -type d -name tests); do
for file in $(find ${dir} -name '*.pp'); do
echo ">>> TESTING ${file}"
puppet apply ${DEBUG} --modulepath modules --noop "${file}" || { echo ">>> ERROR" ; HAS_FAILURES="true" ; }
echo "------------------------------------------------------------------------------"
done
done
if [ "${HAS_FAILURES}" = "true" ]; then
exit 1
fi
It looks for all ‘*.pp’ files inside all ’tests’ directories and does a simple ‘puppet apply’ call. It may not be perfect but it’s small, simple, it works and it’ll save you some gray hairs.
Tag: Disk
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Drive
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Modifahd
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Partition
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Resize
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Virtualbox
Resizing VirtualBox Drives
My Virtualbox images are usually fairly small. Here’s how to increase the size of a vbox disk, to 8G in this case:
vboxmanage modifyhd /home/user/vbox/vm/vm.vdi --resize 8192
After that, remove the disk from the vm it's currently attached to and hook it up to a new vm that boots from an iso file. Once booted, start gparted. It will complain that the gpt partition table is no longer at the end of the disk; ignore that message and you'll see the full disk with the initial partition plus the additional space at the end. Resize the partition on the disk or create new partitions, whatever you like. Save the changes, stop the vm, move the drive back to the initial vm and start that one again.
Tag: Awk
Using awk to remove lines from a file
A little goodie I’m writing up here to document for my own future reference.
Objective: Remove all lines matching a certain pattern from all files in a directory structure. The pattern is in the form of “part1.
First try: grep and sed. Unfortunately that leaves me with blank lines where the matching content was, and I want those lines removed completely. Apparently sed can't do that.
Second try: use awk with regexes. It took me a while - I wanted to use 'next' on matching lines but that didn't seem to work, so I ended up printing the entire line if the regex didn't match. Here goes:
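(The actual one-liner is cut off here; the technique described - print every line unless the regex matches - boils down to something like this, with PATTERN and the file glob being placeholders:)
find . -type f -name '*.txt' | while read -r f; do
  awk '!/PATTERN/ { print }' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done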
Tag: Sed
Using awk to remove lines from a file
A little goodie I’m writing up here to document for my own future reference.
Objective: Remove all lines matching a certain pattern from all files in a directory structure. The pattern is in the form of “part1.
First try: grep and sed. Unfortunately that leaves me with blank lines where the matching content was, and I want those lines removed completely. Apparently sed can't do that.
Second try: use awk with regexes. It took me a while - I wanted to use 'next' on matching lines but that didn't seem to work, so I ended up printing the entire line if the regex didn't match. Here goes:
Tag: Curl
Simple python script to access jmx via jolokia
I've never been much of a python guy. Now there's this production environment where we've got a few basic things like bash, ruby (since we're using puppet) and python. Our Java-based software has a nice JMX interface with a bunch of lifesavers in it, and each JVM has the jolokia agent running (HTTP/JSON access to JMX), so we can access this goodness fairly easily from scripts and/or the shell. So far what we'd be doing would be something like this:
Tag: Json.tool
Simple python script to access jmx via jolokia
I've never been much of a python guy. Now there's this production environment where we've got a few basic things like bash, ruby (since we're using puppet) and python. Our Java-based software has a nice JMX interface with a bunch of lifesavers in it, and each JVM has the jolokia agent running (HTTP/JSON access to JMX), so we can access this goodness fairly easily from scripts and/or the shell. So far what we'd be doing would be something like this:
Tag: Jvm
Simple python script to access jmx via jolokia
I've never been much of a python guy. Now there's this production environment where we've got a few basic things like bash, ruby (since we're using puppet) and python. Our Java-based software has a nice JMX interface with a bunch of lifesavers in it, and each JVM has the jolokia agent running (HTTP/JSON access to JMX), so we can access this goodness fairly easily from scripts and/or the shell. So far what we'd be doing would be something like this:
Tag: Mbean
Simple python script to access jmx via jolokia
I've never been much of a python guy. Now there's this production environment where we've got a few basic things like bash, ruby (since we're using puppet) and python. Our Java-based software has a nice JMX interface with a bunch of lifesavers in it, and each JVM has the jolokia agent running (HTTP/JSON access to JMX), so we can access this goodness fairly easily from scripts and/or the shell. So far what we'd be doing would be something like this:
Tag: Alignment
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Capfile
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Capistrano
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Formatting
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Output
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Ruby
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Run
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
Tag: Ssh
Pretty Capistrano output
I recently started tinkering around with capistrano to automate a few things like doing checks and generally make my life easier in terms of managing a bunch of linux machines. While capistrano is really cool, its output is all over the place and I find that very irritating. Maybe it's just me, but I don't like non-aligned repeating strings. Take this output for example:
** [out :: borked.swisstech.net] puppetd (pid 14395) is running...
** [out :: carmine.swisstech.net] puppetd (pid 14433) is running...
** [out :: mandelbrot.swisstech.net] puppetd (pid 2835) is running...
** [out :: magi.swisstech.net] puppetd (pid 28830) is running...
** [out :: enchilada.swisstech.net] puppetd (pid 4455) is running...
** [out :: titan.swisstech.net] puppetd (pid 30098) is running...
** [out :: kronos.swisstech.net] puppetd (pid 27332) is running...
** [out :: serenity.swisstech.net] puppetd (pid 11072) is running...
** [out :: ackbar.swisstech.net] puppetd (pid 17522) is running...
** [out :: r2d2.swisstech.net] puppetd (pid 15535) is running...
** [out :: seraph.swisstech.net] puppetd (pid 24193) is running...
** [out :: box.swisstech.net] puppetd (pid 23061) is running...
** [out :: fermi.swisstech.net] puppetd (pid 8380) is running...
** [out :: gossamer.swisstech.net] puppetd (pid 28875) is running...
** [out :: marvin.swisstech.net] puppetd (pid 17977) is running...
The above is one of the easier-to-read examples. There are many more that read much worse than that (but are too big to paste into the blog here).
The simplest backup solution ever...
… not for everyone maybe, but for me as a linux/java/groovy person it is!
You'll need:
- java
- groovy
- rsync
- ssh
- cron
I want to regularly copy files from a bunch of paths on several remote hosts to the localhost. I don't know enough about arrays etc. in bash to configure this stuff in a simple way, but the solution I've come up with is simple and elegant:
I wrote a little groovy script that generates the shell commands to be executed. The output of the groovy script is piped into bash, which executes the commands. That call to the groovy script, with the pipe into bash, lives in a little shell script which is called from cron.
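As a rough sketch of the glue (the script names and paths here are made up for illustration, not taken from the original post), the wrapper called from cron is just:
#!/bin/bash
# run-backup.sh: let the groovy script emit the rsync commands and let bash execute them
groovy ~/bin/backup-commands.groovy | bash
plus a crontab entry to run it, e.g. nightly at 03:00:
0 3 * * * /home/user/bin/run-backup.sh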
Tag: Image
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
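(The test is easy to reproduce with ImageMagick; here is a minimal sketch with placeholder file names - the author's exact commands may well have differed:)
cp original.jpg resaved.jpg
for i in $(seq 1 30); do
  convert resaved.jpg -quality 100 resaved.jpg # re-save the JPEG 30 times
done
composite -compose difference original.jpg resaved.jpg diff.png # visualize what changed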
Tag: Imagemagick
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
The simplest way to create pdf's from images
There’s probably no simpler way to create a pdf from images. Behold:
convert *.jpg output.pdf
That’s all folks!
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I've photographed a ton recently, and on one occasion I screwed things up a bit and had a lot of slightly out-of-focus images. I've tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening (after noise reduction, because a lot was shot at high ISO - for me and my 50D, ISO 1250 is a lot already) but wasn't happy with it. I've found reasonable settings for noise reduction, which makes the image a bit softer, but sharpening wouldn't do anything to compensate, let alone actually sharpen.
Tag: Imagequality
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Jpeg
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Jpg
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Magick
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Myth
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Photography
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails I quickly threw together that lensboard. Add some felt to make it completely light-tight and you're done.
Tag: Qualityloss
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: Save
Quality loss when saving as JPEG
So recently an article was linked on reddit about a bunch of photography myths. Number 7, "Saving a JPEG multiple times degrades quality", which was apparently proven to be wrong, disturbed me the most, so I did a quick test of my own.
Further below are three images: the original, the one saved 30 times, and an XOR of the first two showing the differences. If you toggle between the first two you can see quite a difference in the color on the car and several other details. Running any more tests at lower than 100% quality is pretty much a waste of time once you see these results.
Tag: 4x5
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails I quickly threw together that lensboard. Add some felt to make it completely light-tight and you're done.
Tag: Analog
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails I quickly threw together that lensboard. Add some felt to make it completely light-tight and you're done.
Tag: Claron
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails I quickly threw together that lensboard. Add some felt to make it completely light-tight and you're done.
Tag: Film
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: G-Claron
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Kreuznach
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Largeformat
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Lensboard
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Schneider
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Sinar
Building a wooden Lensboard for the Schneider-Kreuznach G-Claron 240/11 WA
I recently bought a nice little repro lens, the Schneider-Kreuznach G-Claron 240/11 WA. The Schneider-Kreuznach archive contains some more information on the G-Claron 240.
So without a real plan, only a rough idea, I went ahead and bought some materials to build myself a little lensboard. With a little wood, some glue and a few nails, I quickly threw together the lensboard. Add some felt to make it completely light-tight and you’re done.
Tag: Environment
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
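Spelled out as commands, that setup looks roughly like this; the paths and the agent name simply follow the example layout above.
# Sketch of the setup described above; paths follow the example layout.
# In /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh set:
#   RUN_AS_USER=bamboo
sudo ln -s /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh /etc/init.d/java-agent-1
sudo /etc/init.d/java-agent-1 start    # likewise: stop, restart, status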
Tag: Home
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Root
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Service
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Tanuki
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Updated
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Variables
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Wrapper
Tanuki Software's Java Service Wrapper and your Environment Variables
This post has been updated
Here’s an evil little gotcha I ran into when using the Tanuki Software Java Service Wrapper to run our company bamboo server. But first you need to know a little bit of context:
We had our bamboo agent running as root for a long time, and about two months ago I changed that. I created a bamboo user and installed multiple agents, each into its own folder along the lines of /home/bamboo/bamboo/java-agent-1 etc. Now in each of these is a wrapper script under bin/bamboo-agent.sh. Reading up a bit on the wrapper, I found out I can edit the script, set the variable RUN_AS_USER=bamboo, and then simply create a symlink from /etc/init.d/java-agent-1 to /home/bamboo/bamboo/java-agent-1/bin/bamboo-agent.sh and start/stop the service just like any other normal unix service.
Tag: Apache
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
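For reference, the ‘direct’ route mentioned above boils down to invoking CXF’s wsdl2java tool yourself. A minimal sketch follows; the CXF location, package name and WSDL path are placeholders, and this is not the JavaExec approach the post settles on.
# Calling CXF's wsdl2java directly; all paths and the package are placeholders.
$CXF_HOME/bin/wsdl2java \
    -d build/generated-sources/cxf \
    -p com.example.generated \
    src/main/resources/service.wsdl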
Tag: Cxf
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Tag: Task
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here’s another little “gotcha” on Gradle tasks: when they’re triggered. I’ve seen quite a few posts on Stack Overflow along the lines of “How can I call a Gradle task?”. You can’t (well, ok, you could, but you don’t want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn’t work, so I tried compileJava and classes and… only to realize later that the dependency was in the wrong direction: my task depends on one that is being executed, but that doesn’t trigger my task; it merely states that my task, should it be executed, can only run after some other task.
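A quick way to see whether a task is actually wired into the graph is Gradle’s dry-run mode, which lists the tasks that would be executed without running them; the task name here is just the jar task from the example.
# List the tasks Gradle would execute for 'jar' without actually running them.
# If the copy task doesn't show up, nothing in the graph depends on it yet.
gradle jar --dry-run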
Anatomy of a Gradle Task: Execution Time
So I’m trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren’t clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won’t pick them up since they’re not Java source files. What happened was that, depending on which approach I tried, sometimes the copy worked and sometimes it didn’t, and I couldn’t understand why.
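Whichever way the copy ends up being wired in, a quick sanity check is to look inside the resulting jar; the jar path is a placeholder.
# Check whether the .properties files actually made it into the jar.
unzip -l build/libs/*.jar | grep '\.properties'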
Tag: Tasks
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Anatomy of a Gradle Task: Task Dependencies
Here’s another little “gotcha” on Gradle tasks: when they’re triggered. I’ve seen quite a few posts on Stack Overflow along the lines of “How can I call a Gradle task?”. You can’t (well, ok, you could, but you don’t want to). A task is executed as a dependency of another task.
Going back to that little copy task I wanted to execute, I needed it to run roughly somewhere after the clean task and before the jar task. I ended up making my copy task dependent on compile. That didn’t work, so I tried compileJava and classes and… only to realize later that the dependency was in the wrong direction: my task depends on one that is being executed, but that doesn’t trigger my task; it merely states that my task, should it be executed, can only run after some other task.
Anatomy of a Gradle Task: Execution Time
So I’m trying out Gradle in the hope of being able to simplify and streamline an existing Maven build which is quite large and inconsistent. I just spent some time figuring out a few gotchas about Gradle tasks, and this post is to help others understand them, because I think the Gradle docs aren’t clear enough on this point.
What I wanted to achieve is to copy all *.properties files in src/main/java to the resources output directory so they are included in the jar, because javac won’t pick them up since they’re not Java source files. What happened was that, depending on which approach I tried, sometimes the copy worked and sometimes it didn’t, and I couldn’t understand why.
Tag: Wsdl2java
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Tag: Wsdltojava
Using WSDLToJava with Gradle
I currently need to generate some Java sources from a WSDL with Gradle. In Maven, I’ve done this using the cxf-codegen-plugin, but I want to avoid anything Maven-related in my new build. There are several resources out there that will tell you to call WSDLToJava either via CXF’s Ant task or directly. The ‘direct’ way still leaves you with several possibilities, like calling ProcessBuilder from your Gradle task or using a JavaExec task. I went for the latter.
Tag: Config
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department Maven proxy/cache.
So depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        "${MVN}" -gs "${HOME}/.m2/settings-work.xml" "$@"
    else
        echo ">>> Running mvn with vanilla config"
        "${MVN}" "$@"
    fi
}
This just checks the IP of eth0 and, if it matches your work address, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
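On newer distributions ifconfig may not be installed by default; the same check can be done with the ip tool. This is just an alternative sketch, with the interface name and address as placeholders.
# Same location check without ifconfig; eth0 and the address are placeholders.
if ip -4 addr show eth0 | grep -q 'YOUR-WORK-IP-HERE'; then
    echo ">>> work network detected"
fi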
Tag: Mvn
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department Maven proxy/cache.
So depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        "${MVN}" -gs "${HOME}/.m2/settings-work.xml" "$@"
    else
        echo ">>> Running mvn with vanilla config"
        "${MVN}" "$@"
    fi
}
This just checks the IP of eth0 and, if it matches your work address, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Tag: Private
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department Maven proxy/cache.
So depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        "${MVN}" -gs "${HOME}/.m2/settings-work.xml" "$@"
    else
        echo ">>> Running mvn with vanilla config"
        "${MVN}" "$@"
    fi
}
This just checks the IP of eth0 and, if it matches your work address, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Tag: Settings
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department Maven proxy/cache.
So depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        "${MVN}" -gs "${HOME}/.m2/settings-work.xml" "$@"
    else
        echo ">>> Running mvn with vanilla config"
        "${MVN}" "$@"
    fi
}
This just checks the IP of eth0 and, if it matches your work address, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Tag: Work
Automatically switch your mvn settings
I have one primary development notebook and take it with me wherever I go. Now, in a company setting you usually have proxies and whatnot. And I think it’s an official best practice to roll your own company or department Maven proxy/cache.
So depending on where you are, you might need a different Maven ~/.m2/settings.xml file. Here’s a very simple shell function you can add to your ~/.bashrc:
function mvn {
    MVN="$(which mvn)"
    if [ -n "$(ifconfig eth0 | grep YOUR-WORK-IP-HERE)" ]; then
        echo ">>> Running mvn with work config"
        "${MVN}" -gs "${HOME}/.m2/settings-work.xml" "$@"
    else
        echo ">>> Running mvn with vanilla config"
        "${MVN}" "$@"
    fi
}
This just checks the IP of eth0 and, if it matches your work address, calls mvn with a special settings.xml. In all other cases mvn is run with the vanilla config (or none, since the settings.xml is optional).
Tag: Cert
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
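Getting the fingerprint straight from the certificate file on the server looks roughly like this; the path to the cert is a placeholder, and older Mercurial versions may expect a SHA-1 fingerprint instead of SHA-256.
# Print the fingerprint from the cert file itself, on the server, not over HTTPS.
openssl x509 -in /etc/ssl/certs/myrepo.example.org.pem -noout -fingerprint -sha256
# For a SHA-1 fingerprint (as older Mercurial releases expect), use -sha1 instead.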
Tag: Certificate
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Hg
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Mercurial
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Openssl
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Self
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Selfsigned
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Signed
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Ssl
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some past postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept the e-mail from me. It will then take care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP happens to be interrupted, it’s no big deal, because here too the local postfix will handle re-sending for me. Nice, huh?
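The core of such a relay setup is only a handful of main.cf settings. Here is a minimal sketch using postconf; the SES endpoint, region and credentials are placeholders and have to match your own SES SMTP credentials.
# Minimal relayhost sketch; the SES endpoint/region and credentials are placeholders.
sudo postconf -e 'relayhost = [email-smtp.us-east-1.amazonaws.com]:587'
sudo postconf -e 'smtp_sasl_auth_enable = yes'
sudo postconf -e 'smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd'
sudo postconf -e 'smtp_sasl_security_options = noanonymous'
sudo postconf -e 'smtp_tls_security_level = encrypt'
# /etc/postfix/sasl_passwd contains: [email-smtp.us-east-1.amazonaws.com]:587 SMTP-USER:SMTP-PASS
sudo postmap /etc/postfix/sasl_passwd
sudo postfix reload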
Tag: X509
Mercurial and self-signed server certificates
So mercurial aborts when you want to interact with a repository that uses a self-signed certificate, as is the case for my own little mercurial repo exposed over https.
NOTE: this is obviously insecure and you must verify the ssl cert’s fingerprint is correct. If you roll your own server, log into the server and get the fingerprint from the cert file itself, not over https since there could be a man in the middle.
Tag: Pdf
The simplest way to create pdf's from images
There’s probably no simpler way to create a pdf from images. Behold:
convert *.jpg output.pdf
That’s all folks!
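If the resulting PDF turns out too large or the pages come out rotated, convert takes the usual ImageMagick options in the same call; the values below are only examples.
# Same idea with some control over orientation, size and compression.
convert *.jpg -auto-orient -resize 1600x1600 -quality 85 output.pdf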
Tag: Amazon
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and had a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but wasn’t happy with it. I found reasonable settings for noise reduction, which makes the image a bit softer, but sharpening wouldn’t do anything to compensate, let alone actually sharpen.
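For orientation, the basic StarCluster workflow for a batch job like this looks roughly as follows; the cluster name and node count are placeholders, and the actual sharpening scripts are the ones in the linked repo.
# Rough StarCluster workflow sketch; cluster name and size are placeholders.
starcluster start -s 4 imagecluster         # boot a 4-node cluster on EC2
starcluster put imagecluster ./raws /data   # copy the images onto the cluster
starcluster sshmaster imagecluster          # log in and kick off the batch job
starcluster terminate imagecluster          # shut everything down when finished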
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some past postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept the e-mail from me. It will then take care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP happens to be interrupted, it’s no big deal, because here too the local postfix will handle re-sending for me. Nice, huh?
Tag: Batch
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Cluster
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Computing
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Convert
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Ec2
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Queueing
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Starcluster
Using StarCluster for some heavy computing
Update: 2012-10-22
I recently put the scripts I used in a little github repo called starcluster-image-sharpener. It’s nothing big but it’ll get you started if needed. Go get them! ;)
Introduction
First a little context:
I’ve photographed a ton recently, and on one occasion I screwed things up a bit and ended up with a lot of slightly out-of-focus images. I tinkered with the Canon Digital Photo Professional settings to find a good level of sharpening, applied after noise reduction because a lot was shot at high ISO (for me and my 50D, ISO 1250 is already a lot), but I wasn’t happy with the result. I found reasonable noise-reduction settings, which make the image a bit softer, but the sharpening wouldn’t do enough to compensate, let alone actually sharpen anything.
Tag: Amazon Web Services
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Email
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Iam
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Mail
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Postfix
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Relay
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Sasl
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Saslauth
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Sendmail
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Ses
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Simple Email Service
Local postfix as relay to Amazon SES
Introduction
Alright, this is a quick guide for the impatient but otherwise experienced Linux admin/hacker/hobbyist. Some prior Postfix experience might be advantageous for general understanding and troubleshooting.
Why would I want a local Postfix relaying to another SMTP server anyway? Simple: when my application code needs to send an e-mail, there is an SMTP server right there, ready to accept it from me. It then takes care of everything else, like re-delivery, dealing with being grey-listed and many other things. Also, if connectivity to the SES SMTP endpoint happens to be interrupted, it’s no big deal, because here too the local Postfix will handle re-sending for me. Nice, huh?
Tag: Controller
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
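The excerpt doesn’t show the code, but the usual (inelegant) trick this hints at is reaching for the current request via Spring’s RequestContextHolder from inside the domain class. A hedged sketch, assuming Grails-era behaviour where the current request attributes expose a params map; the class, field and apiKey value are made up:
import org.springframework.web.context.request.RequestContextHolder

class Book {
    String title

    Map asMap() {
        // grab the current web request; only valid while a request is being processed
        def params = RequestContextHolder.currentRequestAttributes().params
        // hypothetical apiKey check to keep old clients on the old field layout
        boolean legacyClient = params.apiKey == 'old-api-key'
        legacyClient ? [name: title] : [title: title]
    }
}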
Tag: Domainclass
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
Tag: Grails
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
Tag: Param
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
Tag: Params
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
Tag: Request
Accessing Request Parameters in a Grails Domain Class
It’s not exactly elegant to work with request parameters in a domain class, but it was necessary. I have a bunch of domain classes with “asMap” methods where they render themselves into a map and cascade to other domain objects as needed. In the controller, the resulting map is handed to the renderer and we get a nice JSON response.
So now I’ve changed some fields, and in order to stay backwards compatible, I created a new apiKey (a parameter required for all calls to my app) that distinguishes old and new clients.
Tag: Crash
GMail Backup
Considering that Google just lost tons of their users’ e-mails, you might want to do a GMail backup. Also, let’s see how this will develop ;-)
Tag: Gmail
GMail Backup
Considering that Google just lost tons of their users’ e-mails, you might want to do a GMail backup. Also, let’s see how this will develop ;-)
Tag: Google
GMail Backup
Considering that Google just lost tons of their users’ e-mails, you might want to do a GMail backup. Also, let’s see how this will develop ;-)
Tag: Lol
GMail Backup
Considering that Google just lost tons of their users’ e-mails, you might want to do a GMail backup. Also, let’s see how this will develop ;-)
Tag: Rsync
The simplest backup solution ever...
… not for everyone maybe, but for me, as a Linux/Java/Groovy person, it is!
You’ll need:
- java
- groovy
- rsync
- ssh
- cron
I want to regularly copy files from a bunch of paths on several remote hosts to the local machine. I don’t know enough about arrays and the like in Bash to configure this there in a simple way, but the solution I’ve come up with is simple and elegant:
I wrote a little Groovy script that generates the shell commands to be executed. Its output is piped into bash, which executes the commands, and that pipeline sits in a little shell script that is called from cron. A minimal sketch of the generator follows.
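Here is a hedged sketch of what such a generator could look like; the host names, source paths and backup target directory are hypothetical, since the real script and its configuration are not shown in this excerpt:
// gen-backup.groovy -- prints one rsync command per host/path pair (hypothetical config)
def backups = [
    'web01.example.com': ['/etc', '/var/www'],
    'db01.example.com' : ['/etc', '/var/backups/mysql']
]

backups.each { host, paths ->
    paths.each { path ->
        // -a archive mode, -z compress in transit, --delete mirror deletions locally
        println "rsync -az --delete ${host}:${path}/ /backup/${host}${path}/"
    }
}
The wrapper script that cron calls then boils down to something like: groovy gen-backup.groovy | bash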