Re: XSS Test (Score: 3, Informative)

by in S & P sets Tesla's credit rating to B- on 2014-05-29 15:16 (#1YN)

I'll file a bug report.

Re: XSS Test: <script>alert(document.cookie);</script> (Score: 0)

by in S & P sets Tesla's credit rating to B- on 2014-05-29 15:15 (#1YM)

Sorry everyone, I saw a post on this page about XSS vulnerabilities in the title of posts, so I went ahead and submitted a benign example. It's ruining the page for everyone, but does illustrate the vulnerability quite nicely.
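For what it's worth, the usual fix is to escape user-supplied text (like post titles) before it is inserted into the page's HTML, so a injected tag renders as inert text instead of executing. A minimal sketch in TypeScript — this is just the general idea, not pipecode's actual code:

```typescript
// Escape the five HTML-significant characters in user-supplied text.
// "&" must be replaced first, or the later entities get double-escaped.
function escapeHtml(input: string): string {
  return input
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// The benign test title from this thread, rendered harmless:
const title = "XSS Test: <script>alert(document.cookie);</script>";
console.log(escapeHtml(title));
// XSS Test: &lt;script&gt;alert(document.cookie);&lt;/script&gt;
```

Applied at output time to every piece of user-controlled text, this neutralises the title trick above without rejecting any input.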

Re: Thanks (Score: 2, Interesting)

by in Pipecode source released on 2014-05-07 11:02 (#1EV)

PS: Sorry to self-reply, just something to keep in mind: I've been following Bryan(#1)'s comments about the site, and as I understand it there is a bit more to pipedot than meets the eye. He is talking about the codebase being able to connect to other sites running the same codebase, to distribute stories, comments, and topics. Such an idea is really exciting, as I think it fundamentally defuses the issue of monetisation. On top of this, the source is open.

So perhaps in future what we will see is a plethora of sites running pipecode, each with a particular focus or topic. They are all linked, so a user can tweak what they see, or what is brought to their attention.

None of the sites would be particularly large, so the site/system can't be "bought out" by commercial interests.

The drawbacks would be around whether the sites are able to grow large enough to establish communities of their own. This will be the absolute kicker, since what makes communities form is difficult to understand, and even harder to control (look at "failed" efforts like G+) so really this is just going to have to be a "wait and see" situation.

It's a great little/big experiment and I am looking forward to following along. I don't know how many of the comments Bryan gets a chance to read, but I'm sure he'll post a follow up here if my musing/thoughts are absolutely wrong: I'm just responding to other posts I've seen him make, and I may have misunderstood things.

Re: Thanks (Score: 1)

by in Pipecode source released on 2014-05-07 10:53 (#1ET)

I know it's not easy, and I appreciate that you were brainstorming, as am I...

If the objective is to cover server and modest staffing costs then non-intrusive text-based ads with a subscription model to remove them does seem like the tried and true model to follow. Of course I'm no expert, so perhaps the revenue stream from such ads is insufficient.

Certainly, what I see on the internet at large is that when a high-volume site (such as Slashdot) resorts to highly obtrusive, inline video/auto-play ads, and other such BS, then it's a sign that the owners are trying to squeeze as much money as they can from the platform. So for their needs, text-only ads probably aren't "enough" money to make them happy.

If PipeDot is about building a self-driven community then I think those objectives will be fundamentally opposed to any major effort to monetize, whether through obtrusive ads or what I would call "aggressive" subscription models, like the one you suggested. And again, what you suggested isn't bad; I just think it goes against the spirit of a community site that's built on community contributions.

At the end of the day (and I'm still brainstorming) the world seems rather topsy-turvy to me these days. With Facebook paying so much money for WhatsApp, I can't possibly fathom how a user of an instant messaging program can be priced at $35.

So I'm sorry, zafiro17 (and other readers), I've no solutions in hand...

Re: Thanks (Score: 3, Interesting)

by in Pipecode source released on 2014-05-06 09:46 (#1E1)

"Paid subscribers get to see more than the first 20 comments?"

There are fundamental problems with a payment/monetisation strategy such as this one, which seeks to charge readers directly for content that the site's own contributors have posted.

Of course any monetisation of the site is trying to do the same thing, but it's a little less direct than outright charging people to read what others have written.

So far I've trusted Bryan to put together a very tidy site here, and I actually trust him to make some decisions about how to monetise, if indeed that's a direction he wants to go in.

Re: Impressive rate of progress (Score: 4, Insightful)

by in Weekly Update on 2014-04-15 09:00 (#12K)

I disagree; Bryan is doing an excellent job of creating a robust, [mostly] feature-complete site. I don't know his future intentions, but I could see a community here flourishing once he has the code in a place he is happy with.

You may not realise this but the unwashed masses are a fickle bunch.

Call them in too soon and they'll turn up their noses and never come back.

Re: Usenet (Score: 0)

by in Temporarily Offline? on 2014-03-12 12:12 (#H3)

Nice one Bryan, you're doing a great job with the site.

My own initial thoughts were: why not default to a non-JS page (flat html) and then use JS (if enabled) to "roll-up" the comments into the collapsed form. This way it works well for everyone.

But I realised afterwards that my own ideas were a bit naive. If every visitor downloads all of the comments (as flat html) then it probably adds considerably to the load and cost of running the server. By using Javascript you are selectively reducing the amount loaded by most (non-logged-in) users. So I did some more thinking and realised that it actually depends on how many comments there are versus how much Javascript code there is:
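To sketch that "roll-up" idea in TypeScript: the decision of which comments to collapse can live in a small pure function that only the Javascript path ever calls, so no-JS visitors simply get the fully expanded flat page. The names (`CommentMeta`, `collapseIds`) and the score threshold are my own illustrative assumptions, not pipecode's actual API:

```typescript
// Minimal metadata the roll-up decision needs per comment.
interface CommentMeta {
  id: string;
  score: number;
}

// Comments scoring below the threshold get collapsed client-side.
// The flat-html fallback never calls this, so it costs non-JS users nothing.
function collapseIds(comments: CommentMeta[], threshold: number): string[] {
  return comments.filter((c) => c.score < threshold).map((c) => c.id);
}
```

On page load the script would look up each returned id and add a "collapsed" CSS class to hide the comment body; everything stays in the HTML either way.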

- There are three basic ways to handle a page: flat html, javascript on top of flat html, or javascript with ajax

- Logged in users can have preferences which are honoured, but it is also possible to implement similar behaviour for non-logged in users with cookies (ah ha, finally, they are good for something other than tracking) but you'd have to decide if that's worth the hassle.

- The drawback with ajax is that for pages with lots of comments (hundreds), you've just taken your dozen connections (to get one page) and turned them into hundreds of connections, all hammering at the server. I'm imagining you want some kind of clever algorithm to balance the data delivered with the initial page load against that delivered by ajax requests afterwards.

Have you considered encapsulating post data in the page, for laying out by the javascript at rendering time? This would relieve the load on the web server, but would make the DB server work much harder, since each page load will still want to retrieve all posts from the DB. I was thinking they could be put in some JSON format, then gzipped and inserted into the page code at load time. To strike a balance between the initial page load and the ajax load, you can specify rules on what gets included with the initial load (all posts, first 100 posts, all +3 and higher posts, etc...). Those rules don't in any way affect what the user sees, just what content is sent on the initial load. Once in the browser, ajax takes over and selectively loads whatever remaining posts the user wants to look at.

My own web work has led me to believe that to keep pages fast and responsive you must reduce the number of connections to the server. My own sites use custom server-side aggregation code that glues all my javascript and css files together into one include, and I always use the css sprites hack to ensure only one image file is loaded per page. I get much, much better performance this way, but on the other hand I'm working with a much lighter load compared to what you must be serving here.
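The "what ships with the initial load" rule can be sketched as a simple split, here in TypeScript with purely illustrative names (a real rule set would be configurable and applied server-side):

```typescript
// A post as far as the split rule cares: an id and a moderation score.
interface Post {
  id: number;
  score: number;
}

// Posts matching the rule ship embedded in the initial page (e.g. as the
// JSON blob suggested above); the rest are deferred to ajax requests.
function splitForInitialLoad(
  posts: Post[],
  minScore: number,
): { inline: Post[]; deferred: Post[] } {
  const inline = posts.filter((p) => p.score >= minScore);
  const deferred = posts.filter((p) => p.score < minScore);
  return { inline, deferred };
}
```

Other rules ("first 100 posts", etc.) drop in the same way; the key property is that the rule only changes what is sent up front, never what the reader can eventually see.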

Anyway mate, I'm sure you've spent a *lot* longer thinking about these issues than I have, this was just my $0.02 (or less) worth.

Keep up the good work, your page is brilliant!