IBM Websphere MQ manipulations – part II

In my previous article IBM Websphere MQ manipulations – part I, I provided an example of how to put messages onto a queue and get them back off.  What we were doing was essentially a stateless transfer of information between systems.  Each piece of data is important on its own but is not related to any of the other data transferred, which makes this perhaps one of the simplest middleware examples.

IBM Websphere MQ has undoubtedly changed over time; in version 7 IBM added message properties and tried to make the API more compatible with the Java Message Service (JMS) API.  They also added some additional classes to make it easier to add or manipulate the headers of the messages you are sending.

MQ Version 7

A message with an RFH2 header

import java.io.IOException;

import com.ibm.mq.MQC;
import com.ibm.mq.MQEnvironment;
import com.ibm.mq.MQException;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;
import com.ibm.mq.headers.MQDataException;
import com.ibm.mq.headers.MQHeaderIterator;
import com.ibm.mq.headers.MQHeaderList;
import com.ibm.mq.headers.MQRFH2;

public class myMQ7 {


	String mqHostName;
	String mqQueueManagerName;
	String mqQueueChannel;
	String mqQueueName;
	int mqQueuePort;

	String filenametoput;
	String inputpath;

	private MQQueueManager mqQueueManager; // for QMGR object
	private MQQueue mqQueue; // for Queue object


	private void displayException(MQException ex, String action)
	{
		System.out.println("Error while " + action);
		System.out.println("QMGR Name : " + mqQueueManagerName);
		System.out.println("Queue Name : " + mqQueueName);
		System.out.println("CC   : " + ex.completionCode);
		System.out.println("RC   : " + ex.reasonCode);
	}

	public void init(String host, String managername, String channel, String queuename, int queueport)
	{
		mqHostName 			= host;
		mqQueueManagerName 	= managername;
		mqQueueChannel  	= channel;
		mqQueueName    		= queuename;
		mqQueuePort     	= queueport;

		// validity checking left off.
	}

	public void connect()  
	{ 
		try {
			MQEnvironment.hostname = mqHostName;
			MQEnvironment.channel = mqQueueChannel;
			MQEnvironment.port = mqQueuePort;

			mqQueueManager = new MQQueueManager(mqQueueManagerName);
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"doing queue manager connect");
			System.exit(1);
		}
	}
	public void disconnect()  
	{ 													 
		try {
			mqQueueManager.disconnect();
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"doing queue manager disconnect");
			System.exit(1);
		}
	}

	public void open()
	{
		int openOption = MQC.MQOO_OUTPUT | MQC.MQOO_INPUT_AS_Q_DEF;

		try {
			mqQueue = mqQueueManager.accessQueue(mqQueueName, openOption, null, null, null);
		} 
		catch (MQException e) 		
		{
			displayException(e,"doing queue open");
			System.exit(1);
		}
	}

	public void close()  
	{
		try {
			mqQueue.close();
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"closing queue");
			System.exit(1);
		}
	}

	public void putHeaderMessage(String args[])
	{
		init(args[0],args[1],args[2],args[3],Integer.parseInt(args[4]));
		connect();
		open();
		putMessage("Aw, the poor puddy tat!");
		close();
		disconnect();
	}
	
	public void getHeaderMessage(String args[])
	{
		init(args[0],args[1],args[2],args[3],Integer.parseInt(args[4]));
		connect();
		open();
		getMessage("favcolor");
		close();
		disconnect();
	}

	private void putMessage(String messageTextToSend) 
	{
		try {
			// create message
			MQMessage mqm = new MQMessage();
			//mqm.format = MQC.MQFMT_STRING;		// if you do this, your header becomes part of the message
			mqm.format = MQC.MQFMT_RF_HEADER_2;		// if you do this, your header is known to websphere and 
			// you can get properties off message later.

			//
			// do our header stuff
			//
			MQHeaderList list = new MQHeaderList ();

			// create a header type MQRFH2
			MQRFH2 myrfh2 = new MQRFH2();

			// add the favcolor value to the "usr" folder
			// yes, folder and field names are case sensitive.

			try {
				myrfh2.setFieldValue("usr", "favcolor", "yellow");
			} catch (IOException e1) {
				// TODO Auto-generated catch block
				e1.printStackTrace();
			}

			// Add the header to the list of headers (the MQRFH2 I have created) 
			list.add(myrfh2);

			// Add all headers on the list to the message 
			// in this case only one(the MQRFH2 I have created) 
			try {
				list.write(mqm);
			} catch (IOException e1) {
				// TODO Auto-generated catch block
				e1.printStackTrace();
			}

			//
			// convert our message text to bytes and add it to the message, after the headers
			//
			byte[] bytearray = messageTextToSend.getBytes();
			mqm.write(bytearray);

			// send it out.
			//dumpMessage(mqm);
			mqQueue.put(mqm);

			System.out.println("Message sent");
		} 
		catch (MQException ex) 
		{
			displayException(ex,"sending header message");
			System.exit(1);		
		} 
		catch (IOException ex) 
		{
			System.out.println("sending message, write byte array error");
			System.exit(1);		
		}
	}


	private String getMessage(String propertyname)   
	{
		String returnMessage = "";
		String codeLocation = "";
		String propertyvalue = "";

		try {
			MQMessage mqm = new MQMessage();

			// read the message
			mqQueue.get(mqm);
			codeLocation = "get message length";
			int mLen = mqm.getMessageLength();
			System.out.println("Got message, all " + mLen + " bytes");

			// get data from the MQMessage object
			byte[] binMessage = new byte[mLen];

			MQHeaderIterator iter = new MQHeaderIterator(mqm);
			try {
				codeLocation = "skip headers";
				iter.skipHeaders();

				// this is the actual message length without
				// headers and stuff.
				codeLocation = "get data length";
				binMessage = new byte[mqm.getDataLength()];
			} 
			catch (MQDataException e) 
			{
				// TODO Auto-generated catch block
				e.printStackTrace();
			}
			codeLocation = "do readFully";
			mqm.readFully(binMessage);

			// see if our property exists on the message
			try {
				propertyvalue = (String)mqm.getObjectProperty(propertyname);
			}
			catch (MQException mqExp) 
			{
				if (mqExp.reasonCode == 2471)
				{
					// well, guess that property didn't really exist, not that serious of an error
					propertyvalue = "not found";
				}
				else
				{
					mqExp.printStackTrace();
				}
			}

			MQHeaderList headersfoundlist = null;
			try {
				headersfoundlist = new MQHeaderList (mqm);

				System.out.println("headers found in list = " + headersfoundlist.size());
				System.out.println("headers list empty? " + headersfoundlist.isEmpty());

				if (headersfoundlist.size() != 0)
				{
					propertyvalue = "no property found";
					int idx = headersfoundlist.indexOf("MQRFH2");
					MQRFH2 rfh = (MQRFH2) headersfoundlist.get(idx);
					try {
						propertyvalue = (String)rfh.getFieldValue("usr",propertyname);
					} 
					catch (IOException e) 
					{
						// TODO Auto-generated catch block
						e.printStackTrace();
					}
				}
			} 
			catch (MQDataException e1) 
			{	
				// TODO Auto-generated catch block
				e1.printStackTrace();
			} 
			catch (IOException e1) 
			{
				// TODO Auto-generated catch block
				e1.printStackTrace();
			} 

			String foundMessage = new String(binMessage);
			// remember the message body so it can be returned to the caller
			returnMessage = foundMessage;
			System.out.println("message='" + foundMessage + "'");
			System.out.println("property name=" + propertyname);
			System.out.println("property value=" + propertyvalue);
		} 
		catch (MQException mqExp) 
		{
			if (mqExp.reasonCode == 2033)
			{
				// queue empty, not really an error
				System.out.println("no message found");
			}
			else
			{
				displayException(mqExp,"reading from queue");
				System.exit(1);
			}
		} 
		catch (IOException e2) 
		{ 
			System.out.println("IO Exception while " + codeLocation);
			System.exit(1);
		}

		return returnMessage;
	}

	public static void main(String[] args) 
	{
		myMQ7 myputter;
		myputter = new myMQ7();
		myputter.putHeaderMessage(args);
		myputter.getHeaderMessage(args);
	}
}
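
As a side note (my own addition, not part of the original example): if the MQ sample programs are installed on the machine, you can see what the message with its RFH2 header looks like on the queue by putting a message (for example by temporarily commenting out the getHeaderMessage call) and then browsing the queue with the amqsbcg sample.  The queue and queue manager names below are placeholders.

# amqsbcg is the MQ "browse queue" sample; arguments are queue name and queue manager name.
amqsbcg TEST.QUEUE QM1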

IBM Websphere MQ manipulations – part I

Quite some years ago, on a project that I was involved with, I was exposed to Websphere MQ.  Our needs were, to put it mildly, very basic.  We were using it not so much for messages as for a secure method of transferring entire data files.  The files were not all that large and the only requirement was that they were delivered with certainty.

We did not need to set special message id’s or any special attributes.  The data was almost always the same type and we used a naming convention for the data files.  The files were zipped before they were sent and thus the names and other attributes were preserved.

Management didn’t tell us whether this would be difficult or not, they just asked us to get it done.  The Internet proved to be an excellent research tool; there are quite a few examples of simple queue manipulations available online.

What exactly is Websphere MQ

The product emphasizes reliability and robustness of message traffic, and ensures that a message should never be lost when MQ is appropriately configured.  Websphere MQ is a multi-platform product that allows delivery of messages in either a homogeneous or heterogeneous computing environment.  The strength of the solution is its guaranteed delivery of messages.

Every product either continues to change and transform or it stagnates and becomes irrelevant.  During the evolution of the product, IBM has added to the Java classes that can be used to help lighten the load of using their solution in non-trivial ways.

From release to release IBM makes plenty of changes to existing classes or adds new ones to support new functionality.  I don’t have a full overview of all the improvements, but the one that did come up for me was the new support for header records.  I will discuss that further in a subsequent article about MQ.

IBM MQ vs JMS

Websphere MQ is an example of a message oriented middleware (MOM) system, that is, middleware which allows the transmission of message data between systems.  In MQ, the two important pieces are the messages and the queues to store them in.  IBM has added a “queue manager” as the central piece of its particular solution, a concept that does not appear in quite the same form in other products.

The Java Message Service (JMS) was created to allow Java programs to access existing message systems.  However, due to network effects, more and more message oriented middleware systems have also implemented the JMS specification.  Below is just a small sample of systems accessible via JMS.

  • Apache ActiveMQ
  • Open Message Queue, from Oracle
  • OpenJMS, from The OpenJMS Group
  • Solace JMS from Solace Systems
  • SAP NetWeaver Process Integration
  • SonicMQ from Aurea Software
  • SwiftMQ
  • Tervela

A more comprehensive list can be found on the Internet, for example on Wikipedia.

MQ Examples

Below is an example of dealing with messages and queues.  The put and get example will work with either version 6 or version 7 (or newer) of Websphere MQ.  Considering how simple the actual task is, it may well keep working without any changes for some time to come.

MQ Put / Get Example

import java.io.IOException;

import com.ibm.mq.MQC;
import com.ibm.mq.MQEnvironment;
import com.ibm.mq.MQException;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;

public class myMQ6 {

	String mqHostName;
	String mqQueueManagerName;
	String mqQueueChannel;
	String mqQueueName;
	int    mqQueuePort;

	private MQQueueManager mqQueueManager; // for QMGR object
	private MQQueue mqQueue; // for Queue object

	public myMQ6() {
	}

	private void displayException(MQException ex, String action)
	{
		System.out.println("Error while " + action);
		System.out.println("QMGR Name : " + mqQueueManagerName);
		System.out.println("Queue Name : " + mqQueueName);
		System.out.println("CC   : " + ex.completionCode);
		System.out.println("RC   : " + ex.reasonCode);
	}
	
	public void init(String host, String managername, String channel, String queuename, int queueport)
	{
		mqHostName 			= host;
		mqQueueManagerName 	= managername;
		mqQueueChannel  	= channel;
		mqQueueName    		= queuename;
		mqQueuePort     	= queueport;

		// validity checking left off.
	}
	public void connect() 
	{ 
		try {
			MQEnvironment.hostname = mqHostName;
			MQEnvironment.channel = mqQueueChannel;
			MQEnvironment.port = mqQueuePort;

			mqQueueManager = new MQQueueManager(mqQueueManagerName);
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"doing queue manager connect");
			System.exit(1);
		}
	}
	public void disconnect()  
	{ 													 
		try {
			mqQueueManager.disconnect();
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"doing queue manager disconnect");
			System.exit(1);
		}
	}
	public void open()
	{
		int openOption = MQC.MQOO_OUTPUT | MQC.MQOO_INPUT_AS_Q_DEF;

		try {
			mqQueue = mqQueueManager.accessQueue(mqQueueName, openOption, null, null, null);
		} 
		catch (MQException e) 		
		{
			displayException(e,"doing queue open");
			System.exit(1);
		}
	}
	public void close()  
	{
		try {
			mqQueue.close();
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"closing queue");
			System.exit(1);
		}
	}

	public void putSimpleMessage(String messageTextToSend) 
	{
		try {
			// create message
			MQMessage mqm = new MQMessage();
			mqm.format = MQC.MQFMT_STRING;

			// the byte array could be contents of a file, but we 
			// will keep this simple.
			byte[] bytearray = messageTextToSend.getBytes();

			mqm.write(bytearray);

			// send it out.
			mqQueue.put(mqm);

			System.out.println("Message sent");
		} 
		catch (MQException mqExp) 
		{
			displayException(mqExp,"sending message");
			System.exit(1);		
		} 
		catch (IOException e) 
		{
			System.out.println("sending message, write byte array error");
			System.exit(1);		
		}
	}

	private void getSimpleMessage()   
	{
		String spot = "";
		try {
			MQMessage mqm = new MQMessage();		

			// read the message
			mqQueue.get(mqm);
			spot = "get message length";
			int mLen = mqm.getMessageLength();
			System.out.println("Got message, all " + mLen + " bytes");

			// get data from the MQMessage object
			byte[] binMessage = new byte[mLen];

			mqm.readFully(binMessage);

			String messagetext = new String(binMessage);
			System.out.println("message='" + messagetext+ "'");
		} 
		catch (MQException mqExp) 
		{
			if (mqExp.reasonCode == 2033)
			{
				// queue empty, not really an error
				System.out.println("no message found");
			}
			else
			{
				displayException(mqExp,"reading from queue");
				System.exit(1);
			}
		} 
		catch (IOException e2) 
		{ 
			System.out.println("IO Exception while " + spot);
			System.exit(1);
		}
	}

	public void putSimple(String args[])
	{
		init(args[0],args[1],args[2],args[3],Integer.parseInt(args[4]));
		connect();
		open();
		putSimpleMessage("hello world!");
		close();
		disconnect();
	}

	public void getSimple(String args[])
	{
		init(args[0],args[1],args[2],args[3],Integer.parseInt(args[4]));
		connect();
		open();
		getSimpleMessage();
		close();
		disconnect();
	}

	public static void main(String[] args) 
	{
		myMQ6 myputter;
		myputter = new myMQ6();
		myputter.putSimple(args);
		
		myputter.getSimple(args);
	}
}

This was all interesting but extremely basic.  MQ gets more interesting when you can add some sort of metadata to the messages themselves or start using the various headers that are supported.

The two IBM version 6 libraries that are needed on the classpath are com.ibm.mq.jar and connector.jar.
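
For reference, a rough sketch of compiling and running the class above; the jar locations are assumptions that will vary by installation, and the five arguments match init() (host, queue manager, channel, queue name, port).

# Hypothetical paths; adjust to your own MQ client installation.
MQ_JARS=/opt/mqm/java/lib/com.ibm.mq.jar:/opt/mqm/java/lib/connector.jar

javac -cp "$MQ_JARS" myMQ6.java

# host, queue manager, channel, queue name, port
java -cp "$MQ_JARS:." myMQ6 mqhost.example.com QM1 SYSTEM.DEF.SVRCONN TEST.QUEUE 1414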

I will discuss this further in part II in a few days.

Interesting References

www.scribd.com/doc/6811160/Websphere-MQ-Using-Java

 


command line fun – getting to the base of things

Just a short time back I received an email from a colleague containing an error message.  His question was about the error and, more importantly, whether this was my program.  I did find some code that generated the same message, but it wasn’t installed.  Well, it appeared to be installed but it wasn’t what was being run.

I was really in the middle of another issue so I asked my colleague to take a closer look into it.  She actually found a slightly modified version of the code and it wasn’t installed in the normal location. She didn’t think that this was being run either as no log file was placed into the log directory.

Too clever by far.  The developer had made a few small changes using the basename and dirname commands, leaving almost everything else the same.

These commands are essentially the yin and yang of each other.  The basename command strips off the path and returns only the file name, while the dirname command removes the file name and returns the path.

command    argument                result
basename   /home/bob/listing.sh    listing.sh
dirname    /home/bob/listing.sh    /home/bob

The modified script changed both the input and output directories to be relative to the installation directory.  Using basename or dirname like this is a rather exotic way of choosing where to get input or place output.  A much simpler choice would seem to be the current working directory.

#!/bin/bash
LOGFILE=`pwd`/logfile.txt
echo $((5 + 6)) >$LOGFILE

Yet this won’t work unless you run it from the installation directory.  If you run this script from another directory, the output will end up there.  The same general fate will occur if you try and run the script from the crontab.

When a script is run from the crontab it is not started from the installation directory (depending on the cron implementation, the working directory is typically the user’s home directory or the root directory).  So this simple script would probably not work, as the output would end up somewhere unexpected, or could not be written at all if that directory is not writable.  Thus the clever solution is the one that is necessary.

#!/bin/bash
LOGFILE=`dirname $0`/logfile.txt
echo $((5 + 6)) >$LOGFILE

Using the dirname command on the name of the script that is being run ($0) has the same effect as having a variable that points to the installation directory.
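
One small refinement (a sketch of my own, not part of the original script): dirname $0 can return a relative path if the script was started with a relative name, so resolving it to an absolute directory first makes the log location independent of how the script was invoked.

#!/bin/bash
# Resolve the directory the script lives in to an absolute path,
# whether it was started as ./listing.sh or /full/path/listing.sh.
INSTALL_DIR=$(cd "$(dirname "$0")" && pwd)
LOGFILE="$INSTALL_DIR/logfile.txt"
echo $((5 + 6)) > "$LOGFILE"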

 


bash – it’s not a bug, it’s a feature

It’s not a bug, it’s a feature

It has been said so many times it goes beyond cliché.  It is really easy to try to explain away odd behaviors as some sort of esoteric feature.

It took a while before I caught on to the pattern.  I have a habit of copying text between windows, usually in the form of a command.  PuTTY makes this really easy: simply highlight the text in one window and right click in another.

The reason I did the copy and paste in the first place was that the command was really long.  Imagine my surprise when I tried to run it a second time, only to discover it was not in my command history.

If you start your command with one or more spaces, it will usually fail to show up in the command history.  The HISTCONTROL variable controls this behavior: when its value includes ignorespace, any command beginning with a space is omitted from the history.  There are two other possible values for HISTCONTROL, ignoredups and ignoreboth.  The ignoredups value does exactly what it says: if you run the exact same command several times in a row, the duplicates are suppressed.  The ignoreboth value simply combines ignoredups and ignorespace.
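
As a concrete illustration (my own example, assuming a bash ~/.bashrc):

# In ~/.bashrc: drop commands that start with a space as well as consecutive duplicates.
export HISTCONTROL=ignoreboth

# Later, at an interactive prompt, the leading space keeps this line out of the history:
 echo "this command will not be recorded"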

The command history is stored in the file that is pointed to by the HISTFILE variable.

HISTFILE=/home/dock/.bash_history

The history file isn’t without limit; the number of commands that will be stored in it is defined by the HISTFILESIZE variable.

HISTFILESIZE=2000

It is possible to see the last few commands in a few ways.  If the command is really recent, simply use the up and down arrows to scroll through the commands.  When you find the one you are interested in, it is already typed in for you; just press enter.  To list the most recent commands, simply type history.

 973 ls
 974 ls -ltr
 975 sync
 976 cd /
 977 sync
 978 sudo init 0
 979 echo $HISTIGNORE
 980 ls
 981 histroy
 982 history

The history command will display the last HISTSIZE commands

HISTSIZE=100

This will show the last 100 or, with a larger setting, even the last 1000 commands.  This is probably the easiest way to see the commands, but it is also possible to see the last few commands using the fc command (i.e. fc -l).

Starting a command with a space prevents it from showing up in the list of previous commands, but it is possible to extend the filtering to other commands.  There is another variable, HISTIGNORE, which can be set to filter out additional commands.  The variable is just a colon-delimited list of patterns.

HISTIGNORE='ssh:fortune:compress'

This example will suppress any command that begins with ssh, fortune or compress.  This in itself may be enough to keep those really embarrassing commands that contain passwords out of the history file.  It is possible to suppress not only commands that begin with one of these words but any line that contains one of them.

HISTIGNORE='*ssh*:fortune:compress'

Simply add the asterisks and any command line containing ssh will no longer be added to the command history.  Note that these are shell patterns, not regular expressions.  I don’t have any specific filter that I need myself, but as a test I added a simple pattern:

HISTIGNORE="[0-9]*"

This filter works (it drops any command that starts with a digit) but it doesn’t do much that is meaningful.

It is possible to remove any of those embarrassing commands by deleting them.  Simply get a list of the commands and then delete the one you want based on its index number.

> history
 1031 ls
 1032 history
 1033 pwd
 1034 cwd
 1035 cd
 1036 history
>
> history -d 1034
> history
 1031 ls
 1032 history
 1033 pwd
 1034 cd
 1035 history
>

Actually it is a feature, not a bug.


Advertising versus reality (eating in the 21st century)

It was Mother’s Day 2016 and so the entire family wanted to go out and celebrate with a fancy brunch.  We packed up the whole family, made arrangements of when and where to meet, and off we went.  We were so organized that we showed up at the hotel before my in-laws, and that gave me time to wait impatiently, er, compose myself for the upcoming brunch.  It also gave me time to mill around the lobby and find the current copy of the “Frankfurt daily”.  I have no idea if this magazine is really published 365 days a year, but this particular issue contained articles about the IFFA, which is apparently a trade fair for all things in the food preparation industry.

I had no idea just what kinds of articles would be covered.  When I think of food, I guess I think of the advertising that is all over the television.  I am a bit of a skeptic when they show a particular brand of chocolate being poured into a tiny mold by a clean-cut person in a lab coat.  I have a chuckle when I see a family in a small village eating their evening meal with a traditional meat spread which, coincidentally, is also made in a local butcher shop.

No, the reality of food processing is a much different picture.  The food doesn’t come in the door in a small hand-held basket but rather in semi-trailers full of livestock or vegetables, and it is processed on modern assembly lines into the can of corn, the steak or the spare ribs.

From what I have read in this paper, the IFFA is all about meat and meat processing.  I suppose these machines are pretty exciting labor saving devices if you are taking big pieces of cow and turning out hamburgers, but they didn’t do much for me.

I brought home this paper and was trying to decide which of the impersonal machines I should add to this article to reinforce that food preparation is not what you see in the advertising.  Before I could find a good picture, I saw the following comments on the Messe Frankfurt site.

IFFA is a trade fair, open to trade visitors, access is not available to private visitors.

On the whole exhibit ground photographing, filming and carrying of cameras is only allowed with the approval of the Messe Frankfurt.

This makes it sound like either the Messe Frankfurt or the IFFA is not real keen to have people see what kind of machines, treatments or casing are being used in their food.

To get around any problems with either the IFFA or the Frankfurt Messe, I have a few links to some of the assembly line machines.  To be honest, they do look to be amazing works of engineering that might look just as at home in a clean room of a computer chip company.

These links bring you to the machines and in some cases even include videos showing their operation.  I am guessing that some organizations are very conscious of their public image and do not include any machines, nor much information.  Instead they allow you to fill out a form and receive information about their products (I guess they are chicken).

I was a bit surprised at some of the high tech products that are being used by the food processing industry.  One of them gave me a flashback to high school, as the machine that is being used to chop up my food appears to be similar to the band saws that we used in shop class.  A runner-up favorite tool appeared to be a common handsaw.  Upon seeing this I kept thinking about the saw my father used to cut down the Christmas tree each December.  The final tool appeared to be a slight modification of the common knife.

I am not a vegetarian but looking at these machines really took the fun out of my Mother’s Day brunch.  Eventually I will forget about them and go back to being a normal 21st century food consumer, but in the meantime I keep thinking of the slicer machine.


how much are your friends worth?

I sometimes wonder why exactly Google or Microsoft, or rather their search engines, are interested in my queries about cats, recipes or Java syntax.  I suppose the cookies that are left on my hard disk while shopping are interesting to someone, but they mainly reflect things I have bought in the past.

I suppose that these things, when put together, might help some company to sell me either cat collars or the Java Cookbook, 3rd Edition.  Yet this type of information must be pretty valuable, based on the recent announcement that WhatsApp users will no longer need to pay their yearly 99 cent subscription.

Facebook is probably not discontinuing the income stream because it has already paid off the 19 billion it paid to acquire WhatsApp; more likely it is a ploy to increase the user base.  Just like any good supply / demand situation: lower the cost and the number of users increases.  This, along with network effects, should help to create a bigger and more valuable group.

How valuable is the group?  Well, if Facebook is forgoing this fee then the group must be worth at least the 99 cents per person on average but more likely they are trying to get at least 20% on top of that.  Big companies are not so interested in tackling the opportunities yielding 1% more profit unless there are no other options.

Whatsapp userbase estimates

Date       Users       Est. Value
Apr 2013     200,000     240,000
Jan 2014     430,000     516,000
Jan 2015     700,000     840,000
Feb 2016   1,000,000   1,200,000

( group size estimates )

WhatsApp seems to be trying very hard to collect only the minimum amount of information, which mainly comes down to the mobile number.  However, by signing up with WhatsApp you are giving them permission to periodically trawl through your address book to find other people you might be connected with.

On the surface this does seem quite reasonable as it does allow you to find others that you may already know, but it also allows a multinational the ability to track groups of people that are all somehow associated and keep those associations up to date.

Using WhatsApp doesn’t directly give away much information about any of these users or groups.  Yet if any of these people use other related services such as Facebook, some crumbs of information may be collected and group behavior can be correlated at the corporate level.

The WhatsApp user base has grown significantly since the company was purchased by Facebook.  Is the plan to expand the group even further by eliminating the yearly fee?  It is also possible that the growth of WhatsApp users is tapering off and would peak in a couple of years at the current rate.  Another possibility is that there are already some plans for monetizing these groups.

In general I get a bit nervous when I see that any of my “Status Submissions” are being handed over to the company to be used in pretty much any manner that they see fit.

5. User Status Submissions

… you hereby grant WhatsApp a worldwide, non-exclusive, royalty-free, sublicenseable and transferable license to use, reproduce, distribute, prepare derivative works of, display, and perform the Status Submissions in connection with the WhatsApp Service…

Ok, Status Submissions is not the same thing as Whatsapp is reading my messages but their language is worrying to me nonetheless.

Bankruptcy

It is silly to consider that WhatsApp or its parent Facebook would go bankrupt, but that possibility has already been considered.  In fact, should they go bankrupt, be sold, merge, acquire something, or should some other “change of control” occur, they have already determined that they own the data about you and can decide what to do with it.

In the Event of Merger, Sale, or Bankruptcy

… we may not be able to control how your personal information is treated, transferred, or used.

Sometimes Europeans feel a bit better about themselves due to superior data protection laws, but those laws do not have any real impact if they are being signed away.

Special Note to International Users

… you are transferring your personal information to the United States and you expressly consent to that transfer and consent to be governed by California law for these purposes.

“When something online is free, if you’re not the customer, you’re the product.”

http://blogs.law.harvard.edu/futureoftheinternet/2012/03/21/meme-patrol-when-something-online-is-free-youre-not-the-customer-youre-the-product/

Whatsapp Legal notice

https://www.whatsapp.com/legal/


command line fun – listing and sorting

Linux is not Unix, yet it is very similar.  They are so similar I never really notice the difference, even when writing scripts.  I needed to get a list of files sorted by size, so I tried the usual command.

ls -lS

However, there is a small difference when it comes to sorting files by size, at least on Solaris.  I was able to come up with a solution, but it was a bit more convoluted.

ls -lS | sort -n 

This actually worked just fine, despite how odd it looked to me.  However, the sort command can be used to do a lot more than just sorting file sizes.
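
As an aside (a sketch of my own, not part of the original post): since sort can key on a specific field, another way to sort by size without ls -S is to sort explicitly on the size column of the long listing.

# Sort the long listing numerically on the 5th field (the file size).
ls -l | sort -k5,5n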

Most of the time I use the sort command for the simplest of uses.  Usually it is to take a list of values and produce a sorted list, usually to be used later as a form of input.

ls -l | nawk '{print ($6 " " $7 " " $8)}' | sort

May 30 2012
May 30 2012
May 30 2012
May 30 2012
May 30 2012
May 30 2012
May 30 2012
May 30 2012
May 31 2012
May 31 2012
May 31 2012
Sep 27 2011
Sep 27 2011
Sep 27 2011
Sep 27 2011
Sep 30 2011

It depends on the actual input; in this case the values don’t produce a unique list of dates.  It would be possible to add some sort of control break logic to deal with the changing data, but that is fairly heavy programming considering this is shell scripting.  It is actually easier to filter out the duplicate values while sorting.

This filtering can be done by passing the “-u” option to the sort command.  This causes the control break logic to be executed by sort itself, which removes the duplicates.

ls -l | nawk '{print ($6 " " $7 " " $8)}' | sort -u
  
May 30 2012
May 31 2012
Sep 27 2011
Sep 30 2011

This works great depending on the type of input, but the same approach doesn’t work quite so well with numeric values.

ls -l | nawk '{print ($5 )}' | sort 

13752
1423360
2527
2918400
4096
4096
4096
4096
4096
4096
4096
47
55
577
616
686080

The sorted list doesn’t make all that much sense unless you consider that the list has been sorted as alphanumeric data.  The values that begin with ‘1’ are listed before those starting with ‘2’ or ‘4’, despite the entire number being much larger; 1423360 sorts before 2527 because the comparison is character by character, not by numeric value.

This problem was obviously foreseen, as it is possible to use the “-n” parameter to have the values treated as numeric instead of alphanumeric.

ls -l | nawk '{print ($5 )}' | sort -n 

47
55
577
616
2527
4096
4096
4096
4096
4096
4096
4096
13752
686080
1423360
2918400

This covers most of the situations that I have needed the sort command for, but it just scratches the surface of how flexible this command is.  It is possible to sort not only simple lists but delimited lists as well.  The sorting can be done on one or more columns and the order can be reversed if necessary; a short example follows the table below.

sort argument   Description
-u              only unique values
-n              treat values as numeric
-r              sort values in reverse order
-t <char>       field delimiter; used when the input contains multiple fields
-k<#>           when multiple fields are part of the input, field “#” is the field that is
                sorted on.  It is possible to add multiple “-k” parameters to sort on
                multiple fields; the fields are sorted in the order they appear on the
                command line (-k2 -k1 will sort first on field 2, then within field 2 on
                field 1).

Oracle and batch reporting

My first real exposure to a real database was Sybase.  It actually is a fairly nice “little” database that you can come to grips with.  I mean little when compared to the behemoth that is Oracle.  I don’t want to go against the underdog, but it is pretty amazing the stuff that Oracle packs into their solution.  In addition to a relational database with all the friendly editors for creating tables and procedures, they have tools for the command line junkie as well.

How I always end up back at the command prompt is a mystery to me.  Yet, despite finding Oracle to be a pretty large system they did provide a simple little command line tool which can be used for querying data from the database.

This little command line tool is called sqlplus.  With this, I can extract some data from the database and manipulate it myself and then load it right back in.  The task tends to be a small one off, so there is not much point in writing a complicated program using JDBC.

As Sqlplus is a command line tool it is just as plain as you might imagine.  When you log in, you are simply left at a prompt to enter your SQL commands.  There are no GUI’s or wizards to help out, but either by design or accident those fine folks at Oracle tucked in quite a few formatting options to make this an excellent tool not only for performing queries but also for extracting data or even producing small reports.

Setup

At the end of the article is a list of all the scripts used to create these tables along with populating them.

Starting sqlplus is trivial if your environment is already properly setup.  Sorry, setting up Oracle is beyond the scope of this article.

sqlplus user/password@tnsid

<disclaimer>It is not even remotely safe to put the password into a script, as some hacker or disgruntled employee will find the user and password information and undoubtedly do naughty things.</disclaimer>  Putting it on the command prompt is not much better, as it could be found by someone examining the command history or the process list.  That isn’t an important issue for a developer on a development machine, but keep it in mind for production.

Command Line

More than just a simple command line tool, this one can also be used in scripts.  Simply run sqlplus but also pass the name of a parameter file to run as its argument.

sqlplus user/password@tnsid  @mysqlquery.sql

Sqlplus will take this file as a list of input and begin to process each line in the file as if you were typing it.  Don’t forget to add a “quit” statement as the last line of your script otherwise, sqlplus will stay at the prompt after running the script.
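
For example, here is a minimal sketch of driving sqlplus from a shell script; the credentials and TNS alias are placeholders, and the here-document plays the role of the script file, ending with the quit mentioned above.

#!/bin/bash
# -S runs sqlplus in silent mode so the banner does not clutter the output.
sqlplus -S user/password@tnsid <<'EOF'
set pagesize 0
select count(*) from conair_emp;
quit
EOF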

Nobody is perfect, not even the defaults that were chosen by oracle for outputting query data.

Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP,
Data Mining and Real Application Testing options

SQL> SELECT * FROM dbo.conair_emp;

FIRST
----------------------------------------
LAST                                             IQ
---------------------------------------- ----------
EMAIL
--------------------------------------------------------------------------------
max
musterman                                        98
max@musterman.com

mathew
doe                                             115
mathew@doe.com

FIRST
----------------------------------------
LAST                                             IQ
---------------------------------------- ----------
EMAIL
--------------------------------------------------------------------------------

mark
doe                                             115
mark@doe.com

luke
doe                                             115

It is good that there are options for doing a bit of formatting of the raw query data, because the defaults that Oracle has chosen are pretty hideous.

When changing the default output format it is actually possible to create simple reports at the same time.

Simple reporting

The most important part of any report, formal or informal, is some way to save the actual report data.  It is possible to run the sqlplus command for a given sql statement, but unless the output is redirected to a file it will be lost forever.  So perhaps the most important statement is the one that redirects the output to a file.  Sqlplus has the spool command which, when activated, will redirect the output to the given filename in the current directory.

Command   Parameter    Description
spool     <filename>   puts all screen output into the file named <filename>
spool     off          stops logging output to the spool file

Never forget to turn off your spooling or otherwise your entire interactive session will end up in your output.

Beyond saving the output, we need to get it out of the database in a format that is not the ugly default format of sqlplus.  The simplest and most effective way to make the output more friendly is to change the format for each field of the report.  This is done by using the “column” command.

Using A32 or 999 mainly just sets a display length for the fields; no real formatting takes place.

Command   Parameter    Description
column    format       the column format to use for the named column
column    format A32   formats the output as an alphanumeric field 32 characters long

That doesn’t mean that some simple formatting of numeric values is not possible.  The comma and period can be used to format both larger numbers and fractional values so that they display in a more user friendly manner.

Command   Parameter         Description
column    format 999.99     formats the output as a floating point value of at most three digits (ie 0 – 999) with at most two decimal places
column    format 9,999.99   formats the output as a floating point value of at most four digits with two decimal places, with a comma as the thousands separator

It might be possible to do this with European formatting, but I didn’t find it necessary and so I don’t have any further information on how to do it.

Thus by simply adding formatting to the fields, the report becomes instantly friendly.

column first format A15
column last format A15
column iq format 9,999.99
column email format A32

spool report1.rpt
select * from conair_emp;
spool off

The information no longer wraps on the screen, the columns are not much wider than the actual information and everything looks pretty tidy.

FIRST           LAST            IQ        EMAIL
--------------- --------------- --------- --------------------------------
max             musterman           98.00 max@musterman.com
mathew          doe              2,115.00 mathew@doe.com
mark            doe              1,515.00 mark@doe.com
luke            doe                115.00 luke@doe.com
john            doe                978.00 john@mydomain.com
john            lennon             115.00 johnlennon@beatles.com
paul            mccartney        9,115.00 paulmccartney@beatles.com
george          harrison           115.00 georgeharrison@beatles.com
ringo           star               115.00 ringostar@beatles.com

9 rows selected 

This is by no means everything that can be done for the different columns of our report.  It is also possible to change the headings at the same time we are changing the formats.  This might be not as necessary in this particular report as the column names pretty much match the values that are selected.

column first heading 'First name' format A15
column last  heading 'Last name'  format A15
column iq    heading 'Hourly wage|in thousands' format 9,999.99
column email heading 'Email Addr' format A32

spool report1.rpt
select * from conair_emp;
spool off

Not every column name was quite perfect; this is more a feature of the idiot who actually created the database to begin with.

First name      Last name         Hourly wage  Email Addr 
                                  in thousands
--------------- ----------------- ------------ --------------------------------
max             musterman                98.00 max@musterman.com
mathew          doe                   2,115.00 mathew@doe.com
mark            doe                   1,515.00 mark@doe.com
luke            doe                     115.00 luke@doe.com
john            doe                     978.00 john@mydomain.com
john            lennon                  115.00 johnlennon@beatles.com
paul            mccartney             9,115.00 paulmccartney@beatles.com
george          harrison                115.00 georgeharrison@beatles.com
ringo           star                    115.00 ringostar@beatles.com

9 rows selected 

It is also possible to create multi-line descriptions for a column by using the pipe symbol to mark where the field name should be split.

If the report was just a very small report or used for some very casual internal purpose then this would probably be enough.  We have our data, each column is understandable and the important values are formatted in some manner.

This isn’t the limit of what can be done for reporting.  Just like most reports, it is possible to add title and footer information to the report.  This is done using the ttitle and btitle commands.  This report will actually be printed to paper, so we also change the page length so it can be properly paginated.  That is done using the pagesize command.

set pagesize 20
set heading on 
set linesize 80
set trimspool on

ttitle center "company wage report " skip 2
btitle left "-----------------------------------------------------------------------------" skip 1 -
left "confidential information - internal " -
right "page " format 999 sql.pno

column first heading 'First name' format A15
column last  heading 'Last name'  format A15
column iq    heading 'Hourly wage|in thousands' format 9,999.99
column email heading 'Email Addr' format A32

spool report1.rpt
select * from conair_emp;
spool off

I have made the arbitrary decision that the page will only be 20 lines long as I don’t want to fill my blog up with a lot of blank lines.  however, this command will ensure that the page is the proper length and is padded to the end of the page.

                              company wage report

                                 Hourly wage
First name      Last name       in thousands Email Addr
--------------- --------------- ------------ --------------------------------
max             musterman              98.00 max@musterman.com
mathew          doe                 2,115.00 mathew@doe.com
mark            doe                 1,515.00 mark@doe.com
luke            doe                   115.00 luke@doe.com
john            doe                   978.00 john@mydomain.com
john            lennon                115.00 johnlennon@beatles.com
paul            mccartney           9,115.00 paulmccartney@beatles.com
george          harrison              115.00 georgeharrison@beatles.com
ringo           star                  115.00 ringostar@beatles.com



-----------------------------------------------------------------------------
confidential information - internal                                    page    1

9 rows selected.

There are two more commands that can be used depending on how big the report is expected to be.  The first is the “clear screen” command.  This will just clear the screen when it is encountered.  this is probably quite useful for some very tiny report that will give a few lines of information.  The cleared screen makes the information easy to find on your terminal session.

The second command is the “termout” command.  When this setting is set to off, no more output will be displayed on the terminal screen.  This is a good option when hundreds or thousands of lines of output are expected.

It is also possible to calculate totals and have breaks between types of data, but I will leave that for another time.

Extracting data

Actually, most of the general reporting options can also be used when extracting data.  We might still want to do some formatting of our data, but more likely than not we are not interested in the headings.  It is possible to do a simple query, drop the headings, and we have an output format.

Dropping the headings is done by setting the pagesize to zero.  This will then cause no page breaks and it will automatically suppress the headings.

max                     musterman                       98 max@musterman.com
mathew                  doe                           2115 mathew@doe.com
mark                    doe                           1515 mark@doe.com
luke                    doe                            115 luke@doe.com
john                    doe                            978 john@mydomain.com
john                    lennon                         115 johnlennon@beatles.com
paul                    mccartney                     9115 paulmccartney@beatles.com
george                  harrison                       115 georgeharrison@beatles.com
ringo                   star                           115 ringostar@beatles.com


9 rows selected.

This actually creates a very simple fixed-length record that can be parsed by your program: just break each line into its component fields when extracting.  However, I don’t know a lot of developers who are still interested in fixed-length output data.  Sure you can parse it, but why bother when you can have a delimiter-separated file instead?

Here too, oracle helps.  it is possible to set the field separator.

set colsep ';'

This will add a semicolon to our output automatically turning it into csv data.

max                    ;musterman              ;        98;max@musterman.com
mathew                 ;doe                    ;      2115;mathew@doe.com
mark                   ;doe                    ;      1515;mark@doe.com
luke                   ;doe                    ;       115;luke@doe.com
john                   ;doe                    ;       978;john@mydomain.com
john                   ;lennon                 ;       115;johnlennon@beatles.com
paul                   ;mccartney              ;      9115;paulmccartney@beatles.com
george                 ;harrison               ;       115;georgeharrison@beatles.com
ringo                  ;star                   ;       115;ringostar@beatles.com

9 rows selected.

Yes, our data has now been turned into the ugliest comma separated data ever.

Note: the semicolon is the csv file delimiter in excel in some countries.

Extra Credit

There is another way to run these scripts without adding a quit command.  Simply echo the password into the sqlplus command when running a script.  Sqlplus recognizes that it is in batch mode and quits when the commands are finished.

echo secretpass | sqlplus myuser@cdatabase @commands.sql

This might be marginally safer but not by much.  If this were a script, then your password is still in the script to be found by someone.
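
One small improvement (a sketch of my own, not from the article) is to keep the password out of both the script and the command line by prompting for it at run time and passing everything on stdin:

#!/bin/bash
# Prompt for the password without echoing it; the credentials never appear
# in the script, on the command line or in the process list.
read -s -p "Password for myuser: " DBPASS
echo
sqlplus -S /nolog <<EOF
connect myuser/${DBPASS}@cdatabase
@commands.sql
quit
EOF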

There is also a better way to create our csv file.  It requires a bit more effort when creating our script.  Simply concatenate each field that you wish to have into a single string.

SELECT first || ',' || last || ',' || iq || ',' || email FROM dbo.conair_emp;

max       ,musterman ,98,max@musterman.com
mathew    ,doe       ,2115,mathew@doe.com
mark      ,doe       ,1515,mark@doe.com
luke      ,doe       ,115,luke@doe.com
john      ,doe       ,978,john@mydomain.com
john      ,lennon    ,115,johnlennon@beatles.com
paul      ,mccartney ,9115,paulmccartney@beatles.com
george    ,harrison  ,115,georgeharrison@beatles.com
ringo     ,star      ,115,ringostar@beatles.com

The data is really not much nicer to look at than our previous csv attempt, but that is hardly the fault of the database, nor even of sqlplus.  If the table had been created with varchar fields instead of char fields, then the data would not contain the trailing spaces.

max,musterman,98,max@musterman.com
mathew,doe,2115,mathew@doe.com
mark,doe,1515,mark@doe.com
luke,doe,115,luke@doe.com
john,doe,978,john@mydomain.com
john,lennon,115,johnlennon@beatles.com
paul,mccartney,9115,paulmccartney@beatles.com
george,harrison,115,georgeharrison@beatles.com
ringo,star,115,ringostar@beatles.com

It isn’t the fact that you can do a query from the command prompt that is so amazing, but rather the options that you can turn on to turn a small query into a small report or a small data extract.

 

Setup Scripts

create.sql

drop table conair_emp;
commit;
/
create table conair_emp
(
first char(10) not null,
last char(10) not null,
iq float not null,
email char(30) not null
);
commit;
/
prompt
prompt test table conair_emp created
describe conair_emp

 

fill.sql

begin
delete from conair_emp;
commit;
insert into conair_emp (first,last,iq,email) values('max','musterman',98,'max@musterman.com');

insert into conair_emp (first,last,iq,email) values('mathew','doe',2115,'mathew@doe.com');
insert into conair_emp (first,last,iq,email) values('mark','doe',1515,'mark@doe.com');
insert into conair_emp (first,last,iq,email) values('luke','doe',115,'luke@doe.com');
insert into conair_emp (first,last,iq,email) values('john','doe',978,'john@mydomain.com');

insert into conair_emp (first,last,iq,email) values('john','lennon',115,'johnlennon@beatles.com');
insert into conair_emp (first,last,iq,email) values('paul','mccartney',9115,'paulmccartney@beatles.com');
insert into conair_emp (first,last,iq,email) values('george','harrison',115,'georgeharrison@beatles.com');
insert into conair_emp (first,last,iq,email) values('ringo','star',115,'ringostar@beatles.com');
commit;
end;
/

 

report1.sql

clear screen

column first format A15
column last format A15
column iq format 9,999.99
column email format A32

set term on
spool report1.rpt
SELECT * FROM dbo.conair_emp;

spool off

 

report2.sql

clear screen

column first heading 'First name'  format A15
column last heading 'Last name'  format A15
column iq heading 'Hourly wage|in thousands'  format 9,999.99
column email heading 'Email Addr'  format A32

set term on
spool report2.rpt
SELECT * FROM dbo.conair_emp;

spool off

 

report3.sql

SET PAGESIZE 20
SET HEADING on

SET LINESIZE 80
set trimspool on
set termout off

ttitle center "company wage report " skip 2
btitle left "-----------------------------------------------------------------------------" skip 1 -
left "confidential information - internal " -
right "page " format 999 sql.pno

column first heading 'First name'  format A15
column last  heading 'Last name'  format A15
column iq    heading 'Hourly wage|in thousands'  format 9,999.99
column email heading 'Email Addr'  format A32

spool report3.rpt
SELECT * FROM dbo.conair_emp;
spool off

 

export1.sql

SET PAGESIZE 0
SET LINESIZE 999
set trimspool on

spool export1.out
SELECT * FROM dbo.conair_emp;
spool off

 

export2.sql

SET PAGESIZE 0
SET LINESIZE 999
set trimspool on
set colsep ';'

spool export2.out
SELECT * FROM dbo.conair_emp;
spool off

quit

 

export3.sql

SET PAGESIZE 0
SET LINESIZE 999
set trimspool on
set colsep ';'

spool export3.out
SELECT first || ',' || last|| ',' || iq|| ',' || email FROM dbo.conair_emp;
spool off

quit

 


command line fun – more fun with sftp

A few days back we needed to copy some files around in the production environment and the task was given to me.  It was one of those rather simple tasks, as the email contained some examples from a colleague of how I should use sftp to get those files copied across.

I have to admit the syntax was a bit obscure but in general it seemed harmless enough.

sftp -o IdentityFile=id_rsa_somekey someuser@somemachine.ourdomain.com

Indeed, if I ran that command from my terminal it worked like a champ; well, it did after they copied id_rsa_somekey to our ~user/.ssh directory.

This was a great way to connect to the other machine to manually put a file but not all that great for batch copying.  I discussed this with my colleague who agreed and suggested that we simply use scp instead – after all it is the same general protocol family.

So a script was born.  It was an awesome script.  It was a powerful script.

OPTIONS="-o IdentityFile=~batchuser/.ssh/id_rsa_somekey"
EXTSYS_USER=bob
EXTSYS_HOST=somemachine.ourdomain.com
EXTSYS_PUTDIR=/to_dropoff

scp $OPTIONS "$1" $EXTSYS_USER@$EXTSYS_HOST:$EXTSYS_PUTDIR

Well, it at least did get the job done.

Well, this script actually didn’t get very far, because Dave from accounting (I wish it was accounting and not IT) decided that rather than fool around with an internal service that works and has already been paid for, we should simply connect directly to the vendor and put the file there instead.

It wasn’t the technical portion that was the issue for me but rather that John from support was asked to put this together instead of me.  I watched John struggle with this for quite a few hours until he realized that it is just not possible to use a key in this manner in a batch script when it is protected by a passphrase.

Things got a lot easier for John once he decided to remove the passphrase.

echo put $1 > batch_file
sftp -b batch_file -o IdentityFile=~batchuser/.ssh/id_rsa_key -o Port=20022 -o PreferredAuthentications=publickey externaluser@machine.someotherdomain.com

I guess I am a bit of a snob.  I think that the program directory should contain programs or at least static configuration files.  John decided that he would simply create the batch_file in the current working directory.
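
For what it is worth, here is a small sketch (my own variation, not John’s actual script) that keeps the batch file out of both the program directory and the current working directory by using a temporary file:

#!/bin/bash
# Write the sftp batch commands to a temporary file and remove it on exit.
BATCH_FILE=$(mktemp /tmp/sftp_batch.XXXXXX)
trap 'rm -f "$BATCH_FILE"' EXIT

echo "put $1" > "$BATCH_FILE"
sftp -b "$BATCH_FILE" -o IdentityFile=~batchuser/.ssh/id_rsa_key -o Port=20022 \
    -o PreferredAuthentications=publickey externaluser@machine.someotherdomain.com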

I may be a snob, but other than that John’s script was pretty good.  I just kept wondering why he would do it this way instead of using secure copy.  I tried it and, despite working on our internal machines, it failed when I tried it against our vendor’s machine.

exec request failed on channel 0

I did some research and it seems that the problem is more one of setup.  The destination machine was probably set up to accept sftp but not to run arbitrary ssh commands.

Well, it turns out that the scp command line tool in OpenSSH uses the secure copy protocol, which works by executing a command on the remote side over the secure shell channel.  It is (apparently) possible to have the sftp subsystem enabled but remote command execution, and therefore scp, disabled on a server.

I did learn a few things about sftp by watching John’s progress, and I learned with certainty that it is impossible to use a passphrase-protected private key in a batch script without entering the passphrase interactively.  There are other programs, such as ssh-agent, which can help out, but that is a topic for another day.


Ex CEO has big plans for your tax money

It is a fairly perverse situation when a bunch of extremely wealthy businessmen ask the government to do things that they could do themselves but are not willing to do.  In this case I am speaking about research and investment in the skills and innovation that will help their companies (and everyone else) to prosper.

The reason I find it so perverse is that these companies are doing their best to lower their tax bills through a lot of legal tricks that save a lot of money.  Well, we shouldn’t be too surprised that companies make these kinds of maneuvers in order to minimize their tax bill.

What is the responsibility of a corporation?

Their principal and overriding responsibility is to shareholders and it is a responsibility to conduct the operations of the company in such a way as to maximize the wealth of these shareholders.

This actually means that the company’s officers are not doing their job if they are not putting the shareholders at the top of the list of people and organizations to please.

Pro Tax minimization argument

To play the devil’s advocate in favor of corporations: if it is legal, then this is actually a good thing.  The shareholders have invested their money and either get a dividend (some of them do) or see the value of the company’s stock increase.  This money will later be spent on goods and services, and taxes will be paid on it.  The goods and services need to be provided by people, so people get jobs, the taxes get paid and the governments are happy.

This is completely true and good, but in the best possible case the tax money is deferred to some point in the future, while the people who are providing the goods and services are also earning, spending and paying taxes.

This is great but unless the companies themselves are doing a lot of research and development for the future, they are simply reaping now what was sown in the past.  It doesn’t sound so bad when you put it that way, but it is actually worse than that.

Money that is diverted through legal tricks is also money that does not get spent on local taxes for general infrastructure.  Unfortunately this infrastructure is really boring and so nobody really wants to talk about it.

  • roads
  • airports
  • power grids
  • schools
  • universities

When states or countries don’t have the money in the form of taxes, the infrastructure starts to wear out, but airports don’t simply disappear.  True, some maintenance cannot be avoided, but with a smaller pool of tax money the ability to maintain things is limited and problems accumulate or, in the worst case, cause catastrophic failures like the bridge collapse in Minnesota.

If all of this wasn’t bad enough it gets worse.  It is only the big, very big and multi-national companies that can fool around with their earnings.

Microsoft holding money offshore to avoid taxes

Apple, Google and Microsoft holding money off shore to save taxes

Apple borrows money to save tax bill

Microsoft moving profits offshore to avoid taxes

No, that isn’t quite the end of it because small business cannot do the same tricks.  If you are a small grocery store, gas station, dry cleaner or other small business then everything you do is local.  Your money is both earned and taxed and that is the end of the story. Well, except those taxes are probably higher than they need to be because the playing field is not level – I am looking at you big corporations.

Perhaps that means we should just tax the big companies and do it in a really big way.

Pro Tax argument

It is possible to play the devil’s advocate on the tax-them-all side of the argument as well.  The argument is a bit different: if all companies and people pay a lot of taxes, then the infrastructure will be fabulous.

The problem is that enough is never quite enough, and if the coffers were full to overflowing with tax revenue then the politicians would be extremely eager to spend it in any way they see fit.

This is not accusing them of taking 100 dollars and going out and having dinner.  That might be money better spent than the alternative.  The problem is that some people must really, really like being a politician and will do virtually anything to keep the job.  The best way for a politician to keep their job, especially at the federal level, is to continually bring work, jobs and money back to their district.

The easiest way to get the money back to your district is for you and all of your politician friends to take parts of various large projects back home to their state.

F35 Fighter jet built in 45 US states

Rabbit massages

Building a bridge to nowhere

Well, if we somehow managed to keep the money out of the bridge programs and in general away from politicians, we could see that the public sector people received a decent wage.

At first glance this seems OK; after all, we want motivated people at the department of motor vehicles when we get our driver’s license.  Who wouldn’t want to know that the people who are teaching our children are being properly paid?

But where would we draw the line?  Would the wages be so good that people would earn more in the public sector than in the private sector?  The problem is that this then becomes a form of income redistribution which may or may not be fair to all levels of government.  Even if it were fair, which taxpayer would want to keep the government coffers full so that public sector jobs pay more than the people funding them earn?

Pro Responsible argument

Obviously this is a pretty complex issue that could not be solved over a few beers with friends in the bar.  Yet, both fairness and a wide net may be the answer.

It isn’t fair if “Joes Pizza” is paying all the required taxes when large companies simply move the intellectual property to a low tax country and move the profits there.  It also isn’t fair if the taxes are so high that a company cannot make a decent profit for their shareholders.

Whenever I read or hear some billionaire complaining that the basic research isn’t being done by the government, I really hear something slightly different.

We want you, the government,  to spend more money on general research and investment while at the same time we want to use tax tricks to minimize how much money we give you to spend on general R&D and infrastructure.

On the other hand, he might be right.  Basic R&D has been proven time and time again to pay for itself, although sometimes differently than initially expected.

http://blogs.reuters.com/great-debate/2016/04/18/bill-gates-americas-secret-weapon/?utm_source=twitter

 
