A Personal Rant on Trading Bitcoin

My trading practices thus far can be summed up in one phrase: “I’m an idiot.”

It’s up. It’s down. Oh my gosh, I could have made $10k. Oh no, I’ve lost $300. Ack! What does everyone else think?… Just an average day in a cryptocurrency trader’s life, but not in an above-average trader’s life. Much of the same advice you’d get from a financial planner applies to Bitcoin and other cryptocurrencies.

Research the Company

Who are the developers of the currency you’re thinking about buying? Who are their investors? Have they successfully launched a currency before? Have previous projects they worked on failed in spectacular fashion?

There are a lot of motivations for attempting to create a new altcoin. Notoriety, solving social or economic problems, and greed are some of the most popular themes in cryptocurrency. Following the successes or failures of a development team will help you figure out what motivates them. Don’t get stuck thinking greed is a bad motive; several self-interested projects made a lot of money for the development team and the early investors who knew when to sell.

Research the Product

The development teams are going to market their cryptocurrency to garner investment interest, adoption, and higher trading prices. That makes it easy to find the information you need: What problems are they trying to solve? What new blockchain technology are they trying to introduce? Are they bringing a solution to Apple products or mobile devices where others aren’t?

Invest for the Long Term

If your full-time job is staring at charts and day trading, you can still do that with cryptocurrencies. You just need to adjust to the increased volatility. By volatility I mean 40% up or down in a day… or even 30 minutes. But if you’re trading on the intraday bumps, you might find a higher portion of your profits going to fees and splits. So, I say invest for the long term. If I had followed the advice in these sections, I’d have a lot more disposable income.

Personal Stories

I met a guy while working at Dell who told me the story of the $300,000 240 MB hard drive he bought. Yes, MB. He cashed in some of his employee stock purchase plan shares for a new hard drive back when the stock wasn’t worth all that much. By the time he told me the story, the hard drive was worthless and the stock he had sold to buy it was worth $300,000. Oh, how we laughed. And now I’ll relate the story of the guy who bought a pizza with Bitcoin when it was worth pennies; that Bitcoin would now be worth millions. You’d think I’d learn from others, but I too purchased a $700 tablet with Bitcoin that would be worth $4,000 at today’s prices.

But I think more disappointing are the opportunities I missed due to fear.

Stratis is an altcoin someone pointed out to me in December of 2016. The price was under $0.05. I thought, well, let’s wait and see what happens. The interesting thing about Stratis is the development team’s partnership with Microsoft and their decision to build the platform on the .NET Framework. This means the products a developer would write to interact with their blockchain technology can run natively on Windows operating systems without a lot of additional translation or “wrapper” code. The price went up to something over $0.07 and I said, “OK, I’ll buy some,” and invested $300. I woke up one morning a few weeks later and the price was over $0.30, and it has been hovering between $0.40 and $0.50 for the last two weeks. The currency had a lot of the earmarks of a good investment, and I kick myself for not putting in $1,500 or more at the $0.07 price.

DASH, which launched as Dark Coin, is a currency I used to mine. The name Dark Coin certainly sounded cool to the kids, and it was marketed as the first truly anonymous currency because the network had a function called mixing, where your coins could be split up and mixed with fractions of other Dark Coin on the network without additional entries in the blockchain, removing the traceability of the transactions. When fintech investment in blockchain technologies started becoming serious business, they grew up and changed the name to DASH. I had mined 8 DARK when I had a hard drive failure and decided I wouldn’t bother with that currency anymore. At that time, DASH was only worth around $1.00, so I was out maybe $10. Around the same time Stratis had its big jump, DASH went to $100 and has stayed above $50. Now why didn’t I keep mining when the difficulty was low and amass a vast fortune? I was able to restore my Dark Wallet from a backup and retrieve my 8+ DASH, but I could have had 100 over the course of that year.

Check out the stellar rise of PIVX. I looked at it when it was less than $0.03. It’s trading at $1.38 today… $1,500 invested then would be worth over $100,000, and it happened extremely fast.

Stay tuned!

Wikipedia – Bitcoin

https://en.wikipedia.org/wiki/Bitcoin

Bitcoin Forum – The most popular place to discuss all Cryptocurrencies

https://bitcointalk.org/index.php

Cryptocurrency Trading Charts

https://coinmarketcap.com/

Most Profitable Mining Calculations

http://www.coinwarz.com/cryptocurrency

Some Exchanges

https://poloniex.com/

https://btc-e.com/

https://www.gdax.com/

https://www.bittrex.com/


I’m not a DBA, But I Play One on TV: Part 2 – CPU and RAM

In Part 1 I discussed SQL Server and Hard Disk configurations. Now let’s have a look at CPU and RAM. This topic is actually kind of easy. More is better… most of the time.

CPU

It’s my opinion that most development environments should have a minimum of four 2.5+ GHz processors. Whether that’s one socket with four cores or two sockets with two cores each doesn’t really make that much of a difference. For a low-utilization production system you’ll need eight 2.5+ GHz processors. Look, you can get this level of chip in a mid- to high-grade laptop. Now if you’re looking at a very high-utilization system, it’s time to think about 16 processors, or 32 split up over two or more sockets. Once you get to the land of 32 processors, advanced SQL Server configuration knowledge is required. In particular, you will need to know how to tweak the MAXDOP (Maximum Degree of Parallelism) setting.
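If you’re not sure what your server actually has, SQL Server will tell you. Here’s a quick sketch against the sys.dm_os_sys_info DMV (available in SQL Server 2008 and later) showing the logical CPU count and scheduler count:

-- What CPU resources does SQL Server see?
SELECT
    cpu_count,                              -- logical CPUs visible to SQL Server
    hyperthread_ratio,                      -- logical cores per physical processor package
    cpu_count / hyperthread_ratio AS physical_processor_packages,
    scheduler_count                         -- schedulers SQL Server created
FROM sys.dm_os_sys_info;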

Here’s a great read for setting a query hint: http://blog.sqlauthority.com/2010/03/15/sql-server-maxdop-settings-to-limit-query-to-run-on-specific-cpu/

And here are instructions for a system wide setting: http://technet.microsoft.com/en-us/library/ms189094(v=sql.105).aspx

What does this setting do? It controls the number of parallel processes SQL Server will use when servicing your queries. So why don’t we want SQL Server to maximize the number of parallel processes all the time? There is another engine involved that is responsible for determining which processes can and cannot be done in parallel and the order of the parallel batches. In a very highly utilized SQL Server environment this engine can get bogged down. Think of it like air traffic control at a large airport… but there’s only one controller in the tower and it’s Thanksgiving, the biggest air-travel holiday in the US. That one air traffic controller has to assign the runway for every plane coming in and going out. Obviously, he or she becomes the bottleneck for the whole airport. If this individual only had one or two runways to work with, they wouldn’t be the bottleneck; the airport architecture would be. I have seen 32-processor systems grind to a halt with MAXDOP set at 0 because the parallelism rule processing was overwhelmed.
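To make that concrete, here’s what the two approaches from the links above look like in T-SQL. Treat this as a sketch; the table in the first query is made up for illustration, and the right MAXDOP value depends entirely on your workload:

-- Per-query hint: limit this one query to a single parallel worker
SELECT OrderID, SUM(LineTotal) AS OrderTotal
FROM Sales.OrderDetail                      -- hypothetical table
GROUP BY OrderID
OPTION (MAXDOP 1);

-- Instance-wide setting: cap parallelism at 8 for all queries
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 8;
RECONFIGURE;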

For more information on the parallel processing process: http://technet.microsoft.com/en-us/library/ms178065(v=sql.105).aspx

RAM

RAM is always a “more is better” situation. Keep in mind that if you don’t set the size and location of the page file manually, the O/S is going to try to take 1.5 times the RAM from the O/S hard drive. The more RAM on the system, the less often the O/S will have to use the much slower page file. For a development system 8GB will probably be fine, but nowadays you can get a mid- to high-level laptop with 16GB, and even 32GB is getting pretty cheap. For production 16GB is the minimum, but I’d really urge you to get 24GB. And like I said, 32GB configurations are becoming very affordable.
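If you want a quick sanity check of the physical memory and page file as SQL Server sees them, the sys.dm_os_sys_memory DMV is handy (a sketch; available in SQL Server 2008 and later):

-- Physical RAM and page file as reported to SQL Server, in GB
SELECT
    total_physical_memory_kb / 1048576.0     AS total_ram_gb,
    available_physical_memory_kb / 1048576.0 AS available_ram_gb,
    total_page_file_kb / 1048576.0           AS total_page_file_gb,
    available_page_file_kb / 1048576.0       AS available_page_file_gb,
    system_memory_state_desc                 -- e.g. 'Available physical memory is high'
FROM sys.dm_os_sys_memory;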

To Proc or Not to Proc

I’ve had some interesting conversations and fun arguments about how to author queries for SQL Server Reporting Services (SSRS) reports. There are a lot of professionals out there who really want hard-and-fast answers on best practices. The challenge with SSRS is the multitude of configurations available for the system. Is everything (Database Engine, SSAS, SSRS, and SSIS) on one box? Is every service on a dedicated box? Is SSRS integrated with a SharePoint cluster? Where are the hardware investments made in the implementation?

Those are a lot of variables to try to make universal best practices for. Luckily for us, Microsoft provided a tool to help troubleshoot report performance. Within the Report Server database there is a view called ExecutionLog3, which links together various logging tables in that database. Here are some of the more helpful columns it exposes (a sample query follows the list).

  • ItemPath – The path and name of the report that was executed.
  • UserName – The user the report was run as.
  • Format – The format the report was rendered in (PDF, CSV, HTML4.0, etc.).
  • Parameters – The prompt selections made.
  • TimeStart – Server-local date and time the report was executed.
  • TimeEnd – Server-local date and time the report finished rendering.
  • TimeDataRetrieval – Time in milliseconds to get the report data from the data source.
  • TimeProcessing – Time in milliseconds SSRS took to process the results.
  • TimeRendering – Time in milliseconds required to produce the final output (PDF, CSV, HTML4.0, etc.).
  • Status – Succeeded, Failed, Aborted, etc.
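Here’s the kind of query I start with against that view; a sketch, assuming the default ReportServer database name:

-- Last 7 days of report executions, slowest first
USE ReportServer;
SELECT
    ItemPath,
    UserName,
    Format,
    TimeStart,
    TimeDataRetrieval,   -- milliseconds spent getting data
    TimeProcessing,      -- milliseconds spent processing results
    TimeRendering,       -- milliseconds spent producing output
    Status
FROM dbo.ExecutionLog3
WHERE TimeStart >= DATEADD(DAY, -7, GETDATE())
ORDER BY TimeDataRetrieval + TimeProcessing + TimeRendering DESC;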

I always provide two reports based on the information found in this view. The first uses the time columns to give me insight into how the reports are performing and when the system’s utilization peaks. The second focuses on which users are using which reports, to gauge the effectiveness of the reports for their audience.
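In practice those two reports boil down to a couple of aggregations over the same view. A rough sketch of each:

-- Report 1: performance by hour, to find the utilization peaks
SELECT
    CONVERT(DATE, TimeStart)            AS RunDate,
    DATEPART(HOUR, TimeStart)           AS RunHour,
    COUNT(*)                            AS Executions,
    AVG(TimeDataRetrieval)              AS AvgDataMs,
    AVG(TimeProcessing + TimeRendering) AS AvgProcessAndRenderMs
FROM dbo.ExecutionLog3
GROUP BY CONVERT(DATE, TimeStart), DATEPART(HOUR, TimeStart)
ORDER BY RunDate, RunHour;

-- Report 2: who is actually running which reports
SELECT ItemPath, UserName, COUNT(*) AS Executions
FROM dbo.ExecutionLog3
GROUP BY ItemPath, UserName
ORDER BY Executions DESC;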

Generally I’m a big fan of stored procedures, mostly because my reports are usually related to a common data source and stored procedures provide me with a lot of code reuse. Standardizing the report prompt behavior with stored procedures is also handy. A simple query change can cascade to all the reports that use a stored procedure, alleviating the need to open each report and make the same change. Additionally, I like to order the result sets in SQL, not after the data is returned to the report. But that doesn’t mean you’re not going to find better performance moving some functionality between tiers based on the results you find in ExecutionLog3.
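As an illustration of the reuse argument, here’s a minimal sketch of the kind of procedure several reports might share. The procedure, table, and parameter names are made up for the example:

-- One procedure feeds several reports; the ORDER BY lives in SQL,
-- and every report gets the same prompt behavior for @StartDate/@EndDate.
CREATE PROCEDURE dbo.rpt_SalesByRegion      -- hypothetical name
    @StartDate DATE,
    @EndDate   DATE,
    @Region    VARCHAR(50) = NULL           -- NULL means all regions
AS
BEGIN
    SET NOCOUNT ON;

    SELECT Region, SaleDate, SUM(Amount) AS TotalSales
    FROM dbo.Sales                          -- hypothetical table
    WHERE SaleDate >= @StartDate
      AND SaleDate <  DATEADD(DAY, 1, @EndDate)
      AND (@Region IS NULL OR Region = @Region)
    GROUP BY Region, SaleDate
    ORDER BY Region, SaleDate;              -- sorted here, not in the report
END;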

I’m sorry, there just isn’t a one-size-fits-all recommendation for how SSRS reports are structured. Which means: 1) you’ll have to do some research on your configuration, and 2) don’t accept a consultant’s dogma on the topic.

How are you coming with those TPS reports?

Does anyone remember the original “Weekend at Bernie’s”? When the two accountants are poring over the green and white dot-matrix printouts of the accounts on the hot tar roof of their apartment building? That’s the traditional report: pages and pages of numbers. Until the invention of spreadsheets, this was the means by which accountants reviewed the accounts. Larger companies have since outgrown even spreadsheets and demanded larger data storage, like databases. However, a majority of the reporting provided from these robust data stores still looks like a spreadsheet.

Detailed row data has its uses. Financial transactions and system audit logs are very useful when displayed as uniform rows of data for visual scanning. You can easily find the row that doesn’t look like the others when searching for an error, but how easy is it to determine transaction volume, or the frequency of a particular event? Are you going to count the lines and keep a tick-mark tally on another sheet? You can calculate some of these statistics, group them by date, and compare the groups, if all the data is still available at the source; hopefully the query doesn’t slow down the system while users are trying to do their work on it. Or do you save the data in monthly spreadsheets that are backed up regularly? In most cases, the generation of these reports just becomes a meaningless process and a waste of paper.
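Those tick-mark tally questions are exactly what a grouped query answers. A minimal sketch, assuming a hypothetical transaction table:

-- Daily transaction volume and error frequency from the detail rows
SELECT
    CONVERT(DATE, TransactionDate) AS TxnDate,
    COUNT(*)                       AS TransactionCount,
    SUM(CASE WHEN Status = 'Error' THEN 1 ELSE 0 END) AS ErrorCount
FROM dbo.Transactions              -- hypothetical table
GROUP BY CONVERT(DATE, TransactionDate)
ORDER BY TxnDate;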

Business Intelligence (BI), and I don’t know who coined the term, is meant to communicate the difference between a report (any formatted delivery of data) and the display of information in a way that aids in the business decision-making process. BI reporting answers questions like: How are this month’s sales compared to last month’s? Has there been a statistically significant increase in defects with the new modifications to our product?

Many professionals familiar with BI reporting assume that it’s really only applicable to data collected and aggregated over a long period of time. Contact center management is the best example of why this isn’t the case. A contact center is much like an old amateur radio that requires constant tuning to produce the best receiving and transmitting signals. These machines come with a panel full of dials and switches used to make sure the radio and the antenna are in perfect attunement. Similarly, contact center managers are constantly monitoring call handle and queue times, making sure the correct proportion of agents are staffed for email, voice, or chat processing. These managers require timely reports, latent by only 15 or 30 minutes, to determine short-term staffing levels. Most companies see their customer service departments as necessary expenses to keep their customers happy. Decision makers need nearly real-time information to make constant adjustments, maximizing the efficiency of the staff and keeping their customers happy.

The challenge for BI professionals is understanding the users’ needs well enough to deliver the correct solution for each need. There isn’t a one-size-fits-all approach to BI delivery. The assembly manager needs metrics on how many completed plastic toys are failing inspection every half hour. Management needs to compare this month’s inspection failures to the samples taken before switching to the new vendor, perhaps a few times a week. The executive might want to know how sales are going this year compared to the last five, but she only needs this information on the first of the month when she first walks into the office. Each one of these examples has different requirements for the size of the data set, how long the report needs to be displayed, and whether it needs near-term or historical data.

What’s the point? Go run a search on any technology job board for Business Intelligence or BI. Employers are looking for qualified BI professionals to deliver reporting solutions that aid in the business decision-making process. It’s a growing space on par with security and mobile development. If you can get past the stigma placed on this practice by developers, that “reporting work” is somehow inferior to software development, there is a lot of opportunity to be had.